Oct 06 08:22:21 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 06 08:22:22 crc restorecon[4750]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:22:22 crc restorecon[4750]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc 
restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc 
restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 
08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc 
restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:22:22 crc 
restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 
08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:22:22 crc 
restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc 
restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:22:22 crc restorecon[4750]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc 
restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:22:22 crc restorecon[4750]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 
crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc 
restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc 
restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc 
restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc 
restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:22 crc restorecon[4750]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:22 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc 
restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:22:23 crc restorecon[4750]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:22:23 crc restorecon[4750]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 06 08:22:23 crc kubenswrapper[4755]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 08:22:23 crc kubenswrapper[4755]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 06 08:22:23 crc kubenswrapper[4755]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 08:22:23 crc kubenswrapper[4755]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 06 08:22:23 crc kubenswrapper[4755]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 06 08:22:23 crc kubenswrapper[4755]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.605999 4755 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.612938 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.612976 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.612981 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.612985 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.612989 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.612994 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.612999 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613004 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613010 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613015 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613021 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613025 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613029 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613033 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613036 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613040 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613044 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613053 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613059 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613063 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613067 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613071 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613075 4755 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613079 4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613083 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613086 4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613090 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613093 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613097 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613100 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613105 4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613108 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613112 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613115 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613119 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613122 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613126 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613131 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613136 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613140 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613143 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613147 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613152 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613156 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613159 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613163 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613166 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613170 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613173 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613177 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613180 4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613184 4755 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613188 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613193 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613197 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613201 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613204 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613208 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613211 4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613215 4755 feature_gate.go:330] unrecognized feature gate: Example
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613218 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613222 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613225 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613229 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613232 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613236 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613239 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613243 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613246 4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613250 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.613253 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614183 4755 flags.go:64] FLAG: --address="0.0.0.0"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614200 4755 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614209 4755 flags.go:64] FLAG: --anonymous-auth="true"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614215 4755 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614222 4755 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614227 4755 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614234 4755 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614240 4755 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614244 4755 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614248 4755 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614252 4755 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614256 4755 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614261 4755 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614265 4755 flags.go:64] FLAG: --cgroup-root=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614269 4755 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614273 4755 flags.go:64] FLAG: --client-ca-file=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614277 4755 flags.go:64] FLAG: --cloud-config=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614281 4755 flags.go:64] FLAG: --cloud-provider=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614287 4755 flags.go:64] FLAG: --cluster-dns="[]"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614293 4755 flags.go:64] FLAG: --cluster-domain=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614297 4755 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614301 4755 flags.go:64] FLAG: --config-dir=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614305 4755 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614310 4755 flags.go:64] FLAG: --container-log-max-files="5"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614315 4755 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614320 4755 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614324 4755 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614328 4755 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614333 4755 flags.go:64] FLAG: --contention-profiling="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614337 4755 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614342 4755 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614348 4755 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614352 4755 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614358 4755 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614362 4755 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614366 4755 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614371 4755 flags.go:64] FLAG: --enable-load-reader="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614375 4755 flags.go:64] FLAG: --enable-server="true"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614379 4755 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614384 4755 flags.go:64] FLAG: --event-burst="100"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614389 4755 flags.go:64] FLAG: --event-qps="50"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614393 4755 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614397 4755 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614401 4755 flags.go:64] FLAG: --eviction-hard=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614406 4755 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614410 4755 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614414 4755 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614420 4755 flags.go:64] FLAG: --eviction-soft=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614424 4755 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614428 4755 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614432 4755 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614436 4755 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614441 4755 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614445 4755 flags.go:64] FLAG: --fail-swap-on="true"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614450 4755 flags.go:64] FLAG: --feature-gates=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614455 4755 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614459 4755 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614463 4755 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614468 4755 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614473 4755 flags.go:64] FLAG: --healthz-port="10248"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614478 4755 flags.go:64] FLAG: --help="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614482 4755 flags.go:64] FLAG: --hostname-override=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614486 4755 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614490 4755 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614494 4755 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614499 4755 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614503 4755 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614508 4755 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614512 4755 flags.go:64] FLAG: --image-service-endpoint=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614516 4755 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614520 4755 flags.go:64] FLAG: --kube-api-burst="100"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614525 4755 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614529 4755 flags.go:64] FLAG: --kube-api-qps="50"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614533 4755 flags.go:64] FLAG: --kube-reserved=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614537 4755 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614541 4755 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614546 4755 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614550 4755 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614554 4755 flags.go:64] FLAG: --lock-file=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614558 4755 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614577 4755 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614582 4755 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614588 4755 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614592 4755 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614596 4755 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614600 4755 flags.go:64] FLAG: --logging-format="text"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614605 4755 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614609 4755 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614613 4755 flags.go:64] FLAG: --manifest-url=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614617 4755 flags.go:64] FLAG: --manifest-url-header=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614624 4755 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614629 4755 flags.go:64] FLAG: --max-open-files="1000000"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614634 4755 flags.go:64] FLAG: --max-pods="110"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614638 4755 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614643 4755 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614647 4755 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614651 4755 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614655 4755 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614659 4755 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614665 4755 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614677 4755 flags.go:64] FLAG: --node-status-max-images="50"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614681 4755 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614685 4755 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614690 4755 flags.go:64] FLAG: --pod-cidr=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614694 4755 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614700 4755 flags.go:64] FLAG: --pod-manifest-path=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614705 4755 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614709 4755 flags.go:64] FLAG: --pods-per-core="0"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614714 4755 flags.go:64] FLAG: --port="10250"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614718 4755 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614723 4755 flags.go:64] FLAG: --provider-id=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614727 4755 flags.go:64] FLAG: --qos-reserved=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614732 4755 flags.go:64] FLAG: --read-only-port="10255"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614736 4755 flags.go:64] FLAG: --register-node="true"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614740 4755 flags.go:64] FLAG: --register-schedulable="true"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614745 4755 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614753 4755 flags.go:64] FLAG: --registry-burst="10"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614757 4755 flags.go:64] FLAG: --registry-qps="5"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614761 4755 flags.go:64] FLAG: --reserved-cpus=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614766 4755 flags.go:64] FLAG: --reserved-memory=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614771 4755 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614776 4755 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614780 4755 flags.go:64] FLAG: --rotate-certificates="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614784 4755 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614787 4755 flags.go:64] FLAG: --runonce="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614791 4755 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614796 4755 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614801 4755 flags.go:64] FLAG: --seccomp-default="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614805 4755 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614809 4755 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614813 4755 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614817 4755 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614822 4755 flags.go:64] FLAG: --storage-driver-password="root"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614826 4755 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614831 4755 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614835 4755 flags.go:64] FLAG: --storage-driver-user="root"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614839 4755 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614844 4755 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614849 4755 flags.go:64] FLAG: --system-cgroups=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614853 4755 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614860 4755 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614864 4755 flags.go:64] FLAG: --tls-cert-file=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614868 4755 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614873 4755 flags.go:64] FLAG: --tls-min-version=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614877 4755 flags.go:64] FLAG: --tls-private-key-file=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614881 4755 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614885 4755 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614889 4755 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614893 4755 flags.go:64] FLAG: --v="2"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614899 4755 flags.go:64] FLAG: --version="false"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614905 4755 flags.go:64] FLAG: --vmodule=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614918 4755 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.614922 4755 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615031 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615036 4755 feature_gate.go:330] unrecognized feature gate: Example
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615040 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615044 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615048 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615052 4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615056 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615060 4755 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615064 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615068 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615073 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615077 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615080 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615084 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615088 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615092 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615096 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615100 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615104 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615107 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615111 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615115 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615118 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615122 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615126 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615129 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615133 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615137 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615140 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615145 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615149 4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615153 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615157 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615161 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615186 4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615191 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615196 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615199 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615203 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615207 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615211 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615215 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615218 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615222 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615226 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615230 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615235 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615239 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615242 4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615246 4755 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615249 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615253 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615256 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615260 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615263 4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615266 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615270 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615273 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615277 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615280 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615283 4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615287 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615290 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615295 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615300 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615304 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615308 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615312 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615315 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615319 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.615322 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.615336 4755 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.628220 4755 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.628396 4755 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629244 4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629272 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629282 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629292 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629301 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629308 4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629315 4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629320 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629326 4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629331 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629336 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629341 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629346 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629353 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629359 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629364 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629369 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629374 4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629380 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629385 4755 feature_gate.go:330] unrecognized feature gate: Example Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629391 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629396 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629401 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629406 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629411 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629416 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629420 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629425 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629431 4755 feature_gate.go:330] unrecognized 
feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629437 4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629442 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629447 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629451 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629459 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629464 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629469 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629474 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629481 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629487 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629493 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629497 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629502 4755 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629507 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629513 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629517 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629523 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629528 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629532 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629538 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629543 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629548 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629553 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 
08:22:23.629577 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629583 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629590 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629597 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629602 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629608 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629614 4755 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629619 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629624 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629630 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629635 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629640 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629645 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629651 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629656 4755 
feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629660 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629665 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629670 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629675 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.629684 4755 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629921 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629938 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629947 4755 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629956 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629965 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629975 4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 08:22:23 crc 
kubenswrapper[4755]: W1006 08:22:23.629983 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.629992 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630000 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630009 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630017 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630025 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630034 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630042 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630051 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630060 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630068 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630076 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630085 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630094 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630102 4755 feature_gate.go:330] 
unrecognized feature gate: DNSNameResolver Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630111 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630119 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630130 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630142 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630153 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630163 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630172 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630181 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630190 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630198 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630207 4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630216 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630224 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630233 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 
08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630241 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630250 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630260 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630268 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630279 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630290 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630301 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630310 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630318 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630327 4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630336 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630346 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630355 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630363 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 08:22:23 crc 
kubenswrapper[4755]: W1006 08:22:23.630373 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630383 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630392 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630400 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630408 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630416 4755 feature_gate.go:330] unrecognized feature gate: Example Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630425 4755 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630436 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630447 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630456 4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630464 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630473 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630482 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630490 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630498 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630507 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630516 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630525 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630533 4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630541 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630550 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.630592 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 08:22:23 crc 
kubenswrapper[4755]: I1006 08:22:23.630605 4755 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.631749 4755 server.go:940] "Client rotation is on, will bootstrap in background" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.638252 4755 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.638401 4755 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.640606 4755 server.go:997] "Starting client certificate rotation" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.640657 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.640971 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-29 16:26:32.443189306 +0000 UTC Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.641118 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1304h4m8.802076948s for next certificate rotation Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.673310 4755 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 
08:22:23.677112 4755 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.698943 4755 log.go:25] "Validated CRI v1 runtime API" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.740780 4755 log.go:25] "Validated CRI v1 image API" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.743478 4755 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.751707 4755 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-06-08-09-51-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.751767 4755 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.787125 4755 manager.go:217] Machine: {Timestamp:2025-10-06 08:22:23.782224961 +0000 UTC m=+0.611540275 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ec918f86-fe57-44c4-9b07-fa73cce83870 BootID:699772fe-1bda-4c36-8c0f-3619ae33584c Filesystems:[{Device:/dev/vda4 
DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:cd:51:11 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:cd:51:11 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:3e:e9:c8 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:29:b0:45 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6b:bb:05 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:8d:a4:9b Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:16:be:7a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ea:8d:ce:4d:91:c1 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:5e:8f:c1:af:90:3f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 
Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.787480 4755 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.787860 4755 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.789106 4755 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.789357 4755 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.789404 4755 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.791234 4755 topology_manager.go:138] "Creating topology manager with none policy"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.791296 4755 container_manager_linux.go:303] "Creating device plugin manager"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.791890 4755 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.791942 4755 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.792187 4755 state_mem.go:36] "Initialized new in-memory state store"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.792680 4755 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.796356 4755 kubelet.go:418] "Attempting to sync node with API server"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.796399 4755 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.796431 4755 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.796455 4755 kubelet.go:324] "Adding apiserver pod source"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.796476 4755 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.804403 4755 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.805740 4755 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.808222 4755 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.809168 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.809260 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused
Oct 06 08:22:23 crc kubenswrapper[4755]: E1006 08:22:23.809313 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.249:6443: connect: connection refused" logger="UnhandledError"
Oct 06 08:22:23 crc kubenswrapper[4755]: E1006 08:22:23.809374 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.249:6443: connect: connection refused" logger="UnhandledError"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.809752 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.809788 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.809800 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.809810 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.809829 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.809841 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.809855 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.809873 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.809884 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.809895 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.809911 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.809923 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.810935 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.811760 4755 server.go:1280] "Started kubelet"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.812439 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.813411 4755 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.813378 4755 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 06 08:22:23 crc systemd[1]: Started Kubernetes Kubelet.
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.814839 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.814881 4755 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.814932 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 00:02:47.715818631 +0000 UTC
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.815012 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2007h40m23.900812248s for next certificate rotation
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.815500 4755 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.815537 4755 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.815601 4755 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.815700 4755 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 06 08:22:23 crc kubenswrapper[4755]: E1006 08:22:23.815783 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.817266 4755 factory.go:55] Registering systemd factory
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.817315 4755 factory.go:221] Registration of the systemd container factory successfully
Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.817348 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused
Oct 06 08:22:23 crc kubenswrapper[4755]: E1006 08:22:23.817428 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.249:6443: connect: connection refused" logger="UnhandledError"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.817915 4755 server.go:460] "Adding debug handlers to kubelet server"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.817964 4755 factory.go:153] Registering CRI-O factory
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.817999 4755 factory.go:221] Registration of the crio container factory successfully
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.818096 4755 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.818141 4755 factory.go:103] Registering Raw factory
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.818167 4755 manager.go:1196] Started watching for new ooms in manager
Oct 06 08:22:23 crc kubenswrapper[4755]: E1006 08:22:23.818060 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.249:6443: connect: connection refused" interval="200ms"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.819470 4755 manager.go:319] Starting recovery of all containers
Oct 06 08:22:23 crc kubenswrapper[4755]: E1006 08:22:23.820497 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.249:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186bd93ef35159d7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-06 08:22:23.811705303 +0000 UTC m=+0.641020547,LastTimestamp:2025-10-06 08:22:23.811705303 +0000 UTC m=+0.641020547,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.834995 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835133 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835169 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835197 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835219 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835239 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835334 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835357 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835384 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835405 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835427 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835447 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835467 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835495 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835516 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835535 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835554 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835605 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835625 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835648 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835670 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835689 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835710 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835732 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835753 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835776 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835804 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835828 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835850 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835871 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835892 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835912 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835931 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835951 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835972 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.835993 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836010 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836030 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836050 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836073 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836092 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836110 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836130 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836149 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836168 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836191 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836252 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836273 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836293 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836313 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836473 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836495 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836526 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836547 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836596 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836617 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836641 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836660 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836685 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836714 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836742 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836768 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836791 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836833 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836858 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836883 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836908 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836935 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836959 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.836986 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837010 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837035 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837062 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837088 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837113 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837135 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837159 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837186 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837210 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837236 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837265 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837292 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837318 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837342 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93"
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837366 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837391 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837419 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837448 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837475 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837501 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" 
seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837524 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837549 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837611 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837642 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837669 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837693 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 
08:22:23.837721 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837829 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837861 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837890 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837916 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837944 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837971 4755 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.837999 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838034 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838062 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838095 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838123 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838150 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838184 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838214 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838241 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838271 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838301 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838329 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838357 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838392 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838420 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838447 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838476 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838503 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" 
seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838529 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838553 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838622 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838649 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838680 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838703 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 06 08:22:23 crc 
kubenswrapper[4755]: I1006 08:22:23.838727 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838754 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838778 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838801 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838826 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838852 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838877 4755 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838906 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838936 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838962 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.838987 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839010 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839035 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839100 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839130 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839154 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839181 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839207 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839226 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" 
seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839248 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839272 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839291 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839313 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839335 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839355 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.839375 
4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841524 4755 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841642 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841676 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841703 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841736 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 06 08:22:23 crc 
kubenswrapper[4755]: I1006 08:22:23.841763 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841792 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841819 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841847 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841874 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841901 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841932 4755 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841957 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841978 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.841999 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842019 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842040 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842061 4755 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842084 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842104 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842126 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842147 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842166 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842185 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842204 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842245 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842265 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842284 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842303 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842325 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842343 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842362 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842381 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842399 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842418 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842440 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842461 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842480 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842498 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842517 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842538 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842556 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842631 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842685 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842717 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842742 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842762 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842783 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842805 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842825 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842845 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842865 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842888 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842907 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842927 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842948 4755 reconstruct.go:97] "Volume reconstruction finished" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.842964 4755 reconciler.go:26] "Reconciler: start to sync state" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.859361 4755 manager.go:324] Recovery completed Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.871216 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.874051 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.874087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.874096 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.874858 4755 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.874873 4755 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.874893 4755 state_mem.go:36] "Initialized new in-memory state store" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.875332 4755 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.877483 4755 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.877516 4755 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.877535 4755 kubelet.go:2335] "Starting kubelet main sync loop" Oct 06 08:22:23 crc kubenswrapper[4755]: E1006 08:22:23.877583 4755 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 06 08:22:23 crc kubenswrapper[4755]: W1006 08:22:23.878421 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused Oct 06 08:22:23 crc kubenswrapper[4755]: E1006 08:22:23.878502 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.249:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.901670 4755 policy_none.go:49] "None policy: Start" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.902195 4755 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.902219 4755 state_mem.go:35] "Initializing new in-memory state store" Oct 06 08:22:23 crc kubenswrapper[4755]: E1006 08:22:23.916093 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.969306 4755 manager.go:334] 
"Starting Device Plugin manager" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.969746 4755 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.969775 4755 server.go:79] "Starting device plugin registration server" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.970461 4755 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.970502 4755 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.970880 4755 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.971314 4755 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.971334 4755 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.978250 4755 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.978389 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.980445 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.980514 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:23 crc 
kubenswrapper[4755]: I1006 08:22:23.980541 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.981346 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.983603 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.983724 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.985174 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.985250 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.985271 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.986334 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.986384 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.986404 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.986357 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.986450 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.986480 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.987627 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.987672 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.987688 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.987970 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.988068 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.988105 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.988481 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.988769 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.988858 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.990464 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.990486 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.990520 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.990545 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.990519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.990627 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.990812 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.990924 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.991017 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:23 crc kubenswrapper[4755]: E1006 08:22:23.991350 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.991914 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.991968 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.991991 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.992207 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.992274 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.992305 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.992318 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.992365 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.993546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.993657 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:23 crc kubenswrapper[4755]: I1006 08:22:23.993684 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:24 crc kubenswrapper[4755]: E1006 08:22:24.019794 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.249:6443: connect: connection refused" interval="400ms" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.045186 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.045260 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.045345 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.045499 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.045602 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.045663 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.045715 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.045766 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.045812 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.045859 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.045905 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.045956 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.046001 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.046044 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.046084 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.071274 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.072884 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.072929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.072937 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.072968 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:22:24 crc kubenswrapper[4755]: E1006 08:22:24.073653 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.129.56.249:6443: connect: connection refused" node="crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.147741 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.147810 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.147849 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.147881 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.147910 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.147973 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148023 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148105 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148037 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148114 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148197 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148168 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148219 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148198 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148144 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148276 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148253 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148333 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148319 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148238 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148379 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148406 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148413 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148466 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148504 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148536 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148616 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148685 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.148737 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.273831 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.275333 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.275410 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.275434 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.275473 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:22:24 crc kubenswrapper[4755]: E1006 08:22:24.275883 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.249:6443: 
connect: connection refused" node="crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.337080 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.349943 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.373234 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.383675 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: W1006 08:22:24.401858 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-d061860c5b9acc22a6b232a2b24abe4d50a492f0a591dff2b0f0aa4cc67c49ad WatchSource:0}: Error finding container d061860c5b9acc22a6b232a2b24abe4d50a492f0a591dff2b0f0aa4cc67c49ad: Status 404 returned error can't find the container with id d061860c5b9acc22a6b232a2b24abe4d50a492f0a591dff2b0f0aa4cc67c49ad Oct 06 08:22:24 crc kubenswrapper[4755]: W1006 08:22:24.404484 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-0fa0b9b0efb59f416194c656d365f285f9b9ab2411308cf5fe2e438297166526 WatchSource:0}: Error finding container 0fa0b9b0efb59f416194c656d365f285f9b9ab2411308cf5fe2e438297166526: Status 404 returned error can't find the container with id 0fa0b9b0efb59f416194c656d365f285f9b9ab2411308cf5fe2e438297166526 Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.407962 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:22:24 crc kubenswrapper[4755]: W1006 08:22:24.408291 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-cbffecd0e4163cdc7ca7cf118fb93f94e31d0891bb2ba85e8acae86515144b6d WatchSource:0}: Error finding container cbffecd0e4163cdc7ca7cf118fb93f94e31d0891bb2ba85e8acae86515144b6d: Status 404 returned error can't find the container with id cbffecd0e4163cdc7ca7cf118fb93f94e31d0891bb2ba85e8acae86515144b6d Oct 06 08:22:24 crc kubenswrapper[4755]: E1006 08:22:24.420705 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.249:6443: connect: connection refused" interval="800ms" Oct 06 08:22:24 crc kubenswrapper[4755]: W1006 08:22:24.420798 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-287ff76880a5848c5ce131fa2de59e8b91ffc2542f10b3f43b8e68effd29695e WatchSource:0}: Error finding container 287ff76880a5848c5ce131fa2de59e8b91ffc2542f10b3f43b8e68effd29695e: Status 404 returned error can't find the container with id 287ff76880a5848c5ce131fa2de59e8b91ffc2542f10b3f43b8e68effd29695e Oct 06 08:22:24 crc kubenswrapper[4755]: W1006 08:22:24.429686 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-8acbcb705f24d69e29d89b4316e1cebed419848c69a01b8f898f5c6aea4959e2 WatchSource:0}: Error finding container 8acbcb705f24d69e29d89b4316e1cebed419848c69a01b8f898f5c6aea4959e2: Status 404 returned error can't find the container with id 
8acbcb705f24d69e29d89b4316e1cebed419848c69a01b8f898f5c6aea4959e2 Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.677023 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.679345 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.679411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.679431 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.679479 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:22:24 crc kubenswrapper[4755]: E1006 08:22:24.680381 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.249:6443: connect: connection refused" node="crc" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.814213 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused Oct 06 08:22:24 crc kubenswrapper[4755]: W1006 08:22:24.876359 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused Oct 06 08:22:24 crc kubenswrapper[4755]: E1006 08:22:24.876484 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.249:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.885629 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"287ff76880a5848c5ce131fa2de59e8b91ffc2542f10b3f43b8e68effd29695e"} Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.887585 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cbffecd0e4163cdc7ca7cf118fb93f94e31d0891bb2ba85e8acae86515144b6d"} Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.888892 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0fa0b9b0efb59f416194c656d365f285f9b9ab2411308cf5fe2e438297166526"} Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.890284 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d061860c5b9acc22a6b232a2b24abe4d50a492f0a591dff2b0f0aa4cc67c49ad"} Oct 06 08:22:24 crc kubenswrapper[4755]: I1006 08:22:24.891485 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8acbcb705f24d69e29d89b4316e1cebed419848c69a01b8f898f5c6aea4959e2"} Oct 06 08:22:25 crc kubenswrapper[4755]: W1006 08:22:25.126471 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused Oct 06 08:22:25 crc kubenswrapper[4755]: E1006 08:22:25.126626 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.249:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:22:25 crc kubenswrapper[4755]: E1006 08:22:25.222437 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.249:6443: connect: connection refused" interval="1.6s" Oct 06 08:22:25 crc kubenswrapper[4755]: W1006 08:22:25.247149 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused Oct 06 08:22:25 crc kubenswrapper[4755]: E1006 08:22:25.247238 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.249:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:22:25 crc kubenswrapper[4755]: W1006 08:22:25.269587 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused Oct 06 08:22:25 crc kubenswrapper[4755]: E1006 
08:22:25.269695 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.249:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.480523 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.482608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.482648 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.482659 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.482685 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:22:25 crc kubenswrapper[4755]: E1006 08:22:25.483171 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.249:6443: connect: connection refused" node="crc" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.813629 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.900496 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4"} Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.900553 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792"} Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.900579 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7"} Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.903836 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31" exitCode=0 Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.903959 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31"} Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.904017 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.905187 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.905216 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.905227 4755 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.906235 4755 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9" exitCode=0 Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.906307 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9"} Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.906349 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.909351 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.911754 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.911991 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.912009 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.911922 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.912687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.912714 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:25 crc 
kubenswrapper[4755]: I1006 08:22:25.913463 4755 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e5a92d9b20a4ef845d9eb869c33e525fc0325261b2d9041cb8b2a9b8097cc2e3" exitCode=0 Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.913665 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e5a92d9b20a4ef845d9eb869c33e525fc0325261b2d9041cb8b2a9b8097cc2e3"} Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.913687 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.914888 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.914914 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.914926 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.916007 4755 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292" exitCode=0 Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.916042 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292"} Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.916148 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:25 crc 
kubenswrapper[4755]: I1006 08:22:25.918886 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.918931 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:25 crc kubenswrapper[4755]: I1006 08:22:25.918948 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.813488 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused Oct 06 08:22:26 crc kubenswrapper[4755]: E1006 08:22:26.823891 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.249:6443: connect: connection refused" interval="3.2s" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.921605 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c"} Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.922126 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.926446 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.926488 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:26 crc 
kubenswrapper[4755]: I1006 08:22:26.926503 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.928803 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02"} Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.928844 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757"} Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.928863 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a"} Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.928875 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d"} Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.930682 4755 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762" exitCode=0 Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.930849 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.930870 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762"} Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.931826 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.931869 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.931882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.933105 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4af26ae6bcc459bdffb5b3d349c864e2bf5a8c9fdcebbf05a57b081788fb044f"} Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.933276 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.934157 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.934182 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.934191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.936626 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7baf461ac5121358231a5700611f38875e26386b1fe59a2b49ae3b2d976fe083"} 
Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.936708 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5b9ef9720e2410a56e4c7545511fb13d9bd68254cf0072d9dc6afb84de237a33"} Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.936738 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"93874dc90338ebd50d41428b77b4e2dd449e76144dd24496e5a600b34d0493c0"} Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.936806 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.937656 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.937866 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:26 crc kubenswrapper[4755]: I1006 08:22:26.937881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:26 crc kubenswrapper[4755]: W1006 08:22:26.950338 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused Oct 06 08:22:26 crc kubenswrapper[4755]: E1006 08:22:26.950469 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 
38.129.56.249:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.083634 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.086094 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.086154 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.086179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.086224 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:22:27 crc kubenswrapper[4755]: E1006 08:22:27.086903 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.249:6443: connect: connection refused" node="crc" Oct 06 08:22:27 crc kubenswrapper[4755]: W1006 08:22:27.295504 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.249:6443: connect: connection refused Oct 06 08:22:27 crc kubenswrapper[4755]: E1006 08:22:27.295656 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.249:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.946593 4755 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6"} Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.946751 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.948866 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.948928 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.948948 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.950883 4755 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476" exitCode=0 Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.950973 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476"} Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.951020 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.951042 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.951076 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.951185 4755 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.951281 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.952980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.953006 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.953029 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.953049 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.953160 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.953049 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.953198 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.953215 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.953258 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.953317 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:27 crc 
kubenswrapper[4755]: I1006 08:22:27.953336 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:27 crc kubenswrapper[4755]: I1006 08:22:27.953215 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 08:22:28.748135 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 08:22:28.759787 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 08:22:28.957503 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4"} Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 08:22:28.957616 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677"} Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 08:22:28.957653 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233"} Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 08:22:28.957672 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 08:22:28.957731 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 
08:22:28.957875 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 08:22:28.959205 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 08:22:28.959260 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 08:22:28.959285 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 08:22:28.959330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 08:22:28.959360 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:28 crc kubenswrapper[4755]: I1006 08:22:28.959370 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.969161 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91"} Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.969245 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.969277 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.969338 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.969361 
4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.969250 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b"} Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.971213 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.971285 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.971313 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.971244 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.971398 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.971428 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.971560 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.971649 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:29 crc kubenswrapper[4755]: I1006 08:22:29.971672 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:30 crc kubenswrapper[4755]: I1006 
08:22:30.287467 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:30 crc kubenswrapper[4755]: I1006 08:22:30.292304 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:30 crc kubenswrapper[4755]: I1006 08:22:30.292372 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:30 crc kubenswrapper[4755]: I1006 08:22:30.292390 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:30 crc kubenswrapper[4755]: I1006 08:22:30.292430 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:22:30 crc kubenswrapper[4755]: I1006 08:22:30.973424 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:30 crc kubenswrapper[4755]: I1006 08:22:30.975083 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:30 crc kubenswrapper[4755]: I1006 08:22:30.975140 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:30 crc kubenswrapper[4755]: I1006 08:22:30.975159 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:31 crc kubenswrapper[4755]: I1006 08:22:31.022447 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:31 crc kubenswrapper[4755]: I1006 08:22:31.022868 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:31 crc kubenswrapper[4755]: I1006 08:22:31.025254 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:31 crc 
kubenswrapper[4755]: I1006 08:22:31.025334 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:31 crc kubenswrapper[4755]: I1006 08:22:31.025359 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:32 crc kubenswrapper[4755]: I1006 08:22:32.138402 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:32 crc kubenswrapper[4755]: I1006 08:22:32.138666 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:32 crc kubenswrapper[4755]: I1006 08:22:32.140241 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:32 crc kubenswrapper[4755]: I1006 08:22:32.140297 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:32 crc kubenswrapper[4755]: I1006 08:22:32.140318 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:32 crc kubenswrapper[4755]: I1006 08:22:32.362507 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:32 crc kubenswrapper[4755]: I1006 08:22:32.362706 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:22:32 crc kubenswrapper[4755]: I1006 08:22:32.362754 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:32 crc kubenswrapper[4755]: I1006 08:22:32.364326 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:32 crc kubenswrapper[4755]: I1006 08:22:32.364404 4755 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:32 crc kubenswrapper[4755]: I1006 08:22:32.364430 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.319592 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.319928 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.321621 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.321680 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.321691 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.548394 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.548819 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.551119 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.551212 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.551233 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.735725 
4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.735939 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.737300 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.737362 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:33 crc kubenswrapper[4755]: I1006 08:22:33.737403 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:33 crc kubenswrapper[4755]: E1006 08:22:33.992477 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 06 08:22:34 crc kubenswrapper[4755]: I1006 08:22:34.381146 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:34 crc kubenswrapper[4755]: I1006 08:22:34.381684 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:34 crc kubenswrapper[4755]: I1006 08:22:34.383894 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:34 crc kubenswrapper[4755]: I1006 08:22:34.383952 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:34 crc kubenswrapper[4755]: I1006 08:22:34.383961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:37 crc kubenswrapper[4755]: I1006 08:22:37.381319 4755 patch_prober.go:28] 
interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 06 08:22:37 crc kubenswrapper[4755]: I1006 08:22:37.382043 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 08:22:37 crc kubenswrapper[4755]: I1006 08:22:37.703975 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 06 08:22:37 crc kubenswrapper[4755]: I1006 08:22:37.704157 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:37 crc kubenswrapper[4755]: I1006 08:22:37.705503 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:37 crc kubenswrapper[4755]: I1006 08:22:37.705612 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:37 crc kubenswrapper[4755]: I1006 08:22:37.705644 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:37 crc kubenswrapper[4755]: I1006 08:22:37.763159 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]log ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]etcd ok Oct 06 08:22:37 crc 
kubenswrapper[4755]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/generic-apiserver-start-informers ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/priority-and-fairness-filter ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-apiextensions-informers ok Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/crd-informer-synced failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-system-namespaces-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/start-service-ip-repair-controllers failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/rbac/bootstrap-roles failed: reason 
withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/bootstrap-controller failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-kube-aggregator-informers ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]autoregister-completion ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/apiservice-openapi-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: livez check failed Oct 06 08:22:37 crc kubenswrapper[4755]: I1006 08:22:37.763240 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:22:37 crc kubenswrapper[4755]: I1006 08:22:37.779946 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]log ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]etcd ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/generic-apiserver-start-informers ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/priority-and-fairness-filter ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-apiextensions-informers ok Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/crd-informer-synced failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-system-namespaces-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-legacy-token-tracking-controller 
ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/bootstrap-controller failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/start-kube-aggregator-informers ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 06 08:22:37 crc kubenswrapper[4755]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]autoregister-completion ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/apiservice-openapi-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 06 08:22:37 crc kubenswrapper[4755]: livez check failed Oct 06 08:22:37 crc kubenswrapper[4755]: I1006 08:22:37.780038 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" 
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.142913 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.143099 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.144924 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.144971 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.144986 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.152892 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.720410 4755 trace.go:236] Trace[1108042684]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 08:22:28.182) (total time: 14538ms):
Oct 06 08:22:42 crc kubenswrapper[4755]: Trace[1108042684]: ---"Objects listed" error: 14537ms (08:22:42.720)
Oct 06 08:22:42 crc kubenswrapper[4755]: Trace[1108042684]: [14.538061003s] [14.538061003s] END
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.720466 4755 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.721747 4755 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.722861 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.723427 4755 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.724049 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.726852 4755 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.730258 4755 trace.go:236] Trace[544957668]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 08:22:28.133) (total time: 14597ms):
Oct 06 08:22:42 crc kubenswrapper[4755]: Trace[544957668]: ---"Objects listed" error: 14596ms (08:22:42.729)
Oct 06 08:22:42 crc kubenswrapper[4755]: Trace[544957668]: [14.597072111s] [14.597072111s] END
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.730308 4755 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.808337 4755 apiserver.go:52] "Watching apiserver"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.814114 4755 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.816025 4755 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.816713 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.817088 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.817697 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.817357 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.817528 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.818157 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.819012 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.821236 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.821360 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.821626 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.821972 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.822163 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.824859 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.825216 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.825363 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.825398 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.825690 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.827594 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.880381 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.895406 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.913950 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.916875 4755 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.926528 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.926623 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.926670 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.926711 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.926747 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.926782 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.926814 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.926848 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.926891 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.926966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.926998 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927028 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927061 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927092 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927121 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927162 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927209 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927252 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927287 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927329 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927369 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927400 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927500 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927536 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927597 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927632 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927670 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927704 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927740 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927776 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927812 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927848 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927895 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927928 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.927965 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928000 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928038 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928071 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928105 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928190 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928235 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928275 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928308 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928342 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928377 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928412 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928445 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928481 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928516 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: 
I1006 08:22:42.928554 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928615 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928549 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928648 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928682 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928751 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928785 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928819 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928856 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928894 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 
08:22:42.928929 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928500 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.929002 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928803 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.928971 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.929093 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.929110 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.929423 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.929492 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.929542 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.929679 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.929597 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.929725 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.929943 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.930133 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.930946 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.930148 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.930167 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.930172 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.930214 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.930330 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.930375 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.930418 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.930579 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.930613 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.930644 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.932019 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.932325 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.932337 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.932608 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.932664 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.932664 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.932701 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.932941 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.933030 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.933272 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.933500 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.933592 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.933622 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.933954 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.934147 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.934835 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.934963 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.935111 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.935301 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.935805 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.935837 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.935870 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.929043 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.935958 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936011 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936049 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936094 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936137 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936178 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936217 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936252 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936287 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936321 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936347 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936361 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936398 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936436 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936474 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:22:42 
crc kubenswrapper[4755]: I1006 08:22:42.936510 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936549 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936610 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936647 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936687 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936730 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936772 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936885 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936962 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.937002 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.937043 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") 
" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.937097 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.937140 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.937176 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.937214 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.937250 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.937288 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.937326 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.937366 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.939658 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.940134 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.940203 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 08:22:42 crc 
kubenswrapper[4755]: I1006 08:22:42.946938 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.946994 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947026 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947052 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947077 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947103 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947124 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947146 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947169 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947227 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947251 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 
08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947277 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947301 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947322 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947348 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947374 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947394 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947419 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947473 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947504 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947534 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947575 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947596 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947622 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947653 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947684 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947713 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947737 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947763 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947786 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947816 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947841 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947860 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 
06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947883 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947908 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947933 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947953 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947977 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948001 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948023 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948046 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948070 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948095 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948116 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948141 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948166 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948270 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948295 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948315 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948338 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948362 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948387 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948407 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948429 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948450 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948470 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948493 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948517 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948534 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948572 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948596 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948621 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948640 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948666 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948688 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948707 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948730 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948751 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948789 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948813 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948840 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948865 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948887 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948911 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948934 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948955 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948977 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949001 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" 
(UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949024 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949047 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949069 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949091 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949110 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949132 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949155 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949176 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949205 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949244 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949269 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949297 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949326 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949358 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949391 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949418 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 08:22:42 crc 
kubenswrapper[4755]: I1006 08:22:42.949442 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949467 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949493 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949518 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949666 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949720 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949746 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949776 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949808 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949840 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949873 
4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949896 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949923 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949957 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949979 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949996 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950017 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950042 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.936729 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.937158 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950147 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950155 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.935493 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.941661 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.942544 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.946294 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.946625 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.946869 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.946929 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947343 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947386 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947685 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947693 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.947878 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948045 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948241 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948477 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.948987 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949260 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949505 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.949711 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950081 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950093 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950109 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950249 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950288 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950362 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950162 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950591 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950618 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950633 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950646 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950659 4755 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950676 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 
08:22:42.950690 4755 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950704 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950716 4755 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950694 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950731 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950783 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951077 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951094 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951106 4755 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951120 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951131 4755 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951142 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951159 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc 
kubenswrapper[4755]: I1006 08:22:42.951172 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951183 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951193 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951206 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951218 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951229 4755 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951240 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951254 4755 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951264 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951274 4755 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951286 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951298 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951309 4755 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951320 4755 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951331 4755 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951344 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951353 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951365 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951377 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951389 4755 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951400 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951414 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951427 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951455 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951468 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951481 4755 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.950918 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951497 4755 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951511 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.951525 4755 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.953283 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.953893 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.954154 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.955194 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.957503 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.958545 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.959027 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.959166 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.959746 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.959941 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.960095 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.960250 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.960434 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.961248 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.962202 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.962332 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.962434 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.962528 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.963046 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.963319 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.963521 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.963601 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.963602 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.963884 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.964074 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.964296 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.964413 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.964621 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.964725 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.957911 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.964978 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.965343 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.965340 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.965401 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.965548 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.966217 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.966532 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.966611 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.966811 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.966969 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.967057 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.967221 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.967361 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.967480 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:43.467425682 +0000 UTC m=+20.296740896 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.967646 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.967732 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.968405 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.969235 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.969428 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.969440 4755 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.969478 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.970065 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.970300 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.970624 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.970928 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.971254 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.971304 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.971883 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.968553 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.972179 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.972476 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.972877 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.972881 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). 
InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.972070 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.973286 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.973330 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.973337 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.973409 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.973436 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.979535 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.979534 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.979693 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.979743 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.979785 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.979874 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.980042 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.980163 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:43.480141669 +0000 UTC m=+20.309456883 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.980359 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.981140 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.981327 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.981618 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.981939 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.982315 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.982399 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.982647 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.982682 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.982728 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.982807 4755 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.980088 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.983805 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.984414 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.984982 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.985012 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.985376 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.986177 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.986511 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.986703 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.986986 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.987120 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.987357 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.987822 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.988124 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.988489 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.988523 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.988694 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.988932 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.988942 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.989262 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.989496 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.989520 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.989535 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.990741 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.992727 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:43.492696203 +0000 UTC m=+20.322011417 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.993056 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:22:43.493045792 +0000 UTC m=+20.322361006 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.993272 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.993476 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.993501 4755 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.993514 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:42 crc kubenswrapper[4755]: E1006 08:22:42.993544 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:43.493537053 +0000 UTC m=+20.322852267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.994134 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.994975 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.996843 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.996981 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.997046 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.997141 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:42 crc kubenswrapper[4755]: I1006 08:22:42.997265 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.003728 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.004131 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.004796 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.008829 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.009224 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.009241 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.011126 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.012448 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.012643 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.012680 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.013856 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.014675 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.015722 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.015766 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.018876 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.020854 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052149 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052251 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052339 4755 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052375 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052392 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052404 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052419 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052418 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052433 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052500 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052514 4755 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052504 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052528 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052616 4755 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052633 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052647 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052661 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052673 4755 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052686 4755 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052702 4755 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052716 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052729 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052743 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052756 4755 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052768 4755 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052781 4755 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052793 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052805 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052819 4755 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052831 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052846 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052859 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052872 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" 
DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052883 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052895 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052907 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052920 4755 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052932 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052947 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052960 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052973 4755 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.052989 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053002 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053014 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053027 4755 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053039 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053050 4755 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053062 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053074 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053085 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053100 4755 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053111 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053123 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053135 4755 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053146 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053162 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053175 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053187 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053201 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053212 4755 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053226 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053237 4755 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node 
\"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053249 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053264 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053277 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053292 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053304 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053316 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053328 4755 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053339 4755 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053351 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053363 4755 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053376 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053386 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053395 4755 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053404 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053412 4755 reconciler_common.go:293] "Volume detached 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053421 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053429 4755 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053438 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053446 4755 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053456 4755 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053465 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053473 4755 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 06 
08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053483 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053496 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053510 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053522 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053536 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053549 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053588 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 
08:22:43.053600 4755 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053612 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053625 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053638 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053650 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053662 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053674 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053687 4755 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053698 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053708 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053718 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053727 4755 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053736 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053745 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053754 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053765 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053775 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053785 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053794 4755 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053805 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053814 4755 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053823 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053833 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053843 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053852 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053862 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053870 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053900 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053909 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 
crc kubenswrapper[4755]: I1006 08:22:43.053919 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053927 4755 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053936 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053945 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053953 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053962 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053972 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053980 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053989 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.053999 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054007 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054016 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054025 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054033 4755 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054043 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on 
node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054052 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054061 4755 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054069 4755 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054077 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054086 4755 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054096 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054105 4755 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 
08:22:43.054114 4755 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054121 4755 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054130 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.054142 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.096653 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.109951 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.143119 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.143421 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.146951 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.148610 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.154755 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.154793 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.154803 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.154812 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.159707 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:22:43 crc kubenswrapper[4755]: W1006 08:22:43.161891 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-1ed483b2bb9f7be02e3d2e68b52c6cdada0e49644b25ef8cdb37f3b17b06e895 WatchSource:0}: Error finding container 1ed483b2bb9f7be02e3d2e68b52c6cdada0e49644b25ef8cdb37f3b17b06e895: Status 404 returned error can't find the container with id 1ed483b2bb9f7be02e3d2e68b52c6cdada0e49644b25ef8cdb37f3b17b06e895 Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.168352 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:22:43 crc kubenswrapper[4755]: W1006 08:22:43.193616 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-304ba58a76778a2ce7f89e0f40edd6965cdd12affc318379ea2a90b3aa2732a3 WatchSource:0}: Error finding container 304ba58a76778a2ce7f89e0f40edd6965cdd12affc318379ea2a90b3aa2732a3: Status 404 returned error can't find the container with id 304ba58a76778a2ce7f89e0f40edd6965cdd12affc318379ea2a90b3aa2732a3 Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.540768 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jxm75"] Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.541143 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jxm75" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.545232 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.546946 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.547162 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.556986 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.557058 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.557095 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.557136 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.557181 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:43 crc kubenswrapper[4755]: E1006 08:22:43.557277 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:22:43 crc kubenswrapper[4755]: E1006 08:22:43.557330 4755 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:44.55730655 +0000 UTC m=+21.386621764 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:22:43 crc kubenswrapper[4755]: E1006 08:22:43.557672 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:22:44.557663628 +0000 UTC m=+21.386978832 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:22:43 crc kubenswrapper[4755]: E1006 08:22:43.557711 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:22:43 crc kubenswrapper[4755]: E1006 08:22:43.557736 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-06 08:22:44.55772932 +0000 UTC m=+21.387044534 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:22:43 crc kubenswrapper[4755]: E1006 08:22:43.557784 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:22:43 crc kubenswrapper[4755]: E1006 08:22:43.557794 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:22:43 crc kubenswrapper[4755]: E1006 08:22:43.557805 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:43 crc kubenswrapper[4755]: E1006 08:22:43.557827 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:44.557820022 +0000 UTC m=+21.387135236 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:43 crc kubenswrapper[4755]: E1006 08:22:43.557863 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:22:43 crc kubenswrapper[4755]: E1006 08:22:43.557871 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:22:43 crc kubenswrapper[4755]: E1006 08:22:43.557878 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:43 crc kubenswrapper[4755]: E1006 08:22:43.557897 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:44.557891734 +0000 UTC m=+21.387206948 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.563371 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.579723 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.597799 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.610252 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.621394 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.633578 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.645231 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.684771 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1-hosts-file\") pod \"node-resolver-jxm75\" (UID: \"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\") " pod="openshift-dns/node-resolver-jxm75" Oct 06 08:22:43 crc 
kubenswrapper[4755]: I1006 08:22:43.684837 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzb9r\" (UniqueName: \"kubernetes.io/projected/5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1-kube-api-access-kzb9r\") pod \"node-resolver-jxm75\" (UID: \"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\") " pod="openshift-dns/node-resolver-jxm75" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.699257 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.740841 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:43 crc 
kubenswrapper[4755]: I1006 08:22:43.750187 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.763729 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"
2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.780207 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.785291 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1-hosts-file\") pod \"node-resolver-jxm75\" (UID: \"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\") " pod="openshift-dns/node-resolver-jxm75" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.785332 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzb9r\" (UniqueName: \"kubernetes.io/projected/5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1-kube-api-access-kzb9r\") pod \"node-resolver-jxm75\" (UID: \"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\") " pod="openshift-dns/node-resolver-jxm75" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.785513 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1-hosts-file\") pod \"node-resolver-jxm75\" (UID: \"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\") " pod="openshift-dns/node-resolver-jxm75" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.792509 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.807649 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzb9r\" (UniqueName: \"kubernetes.io/projected/5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1-kube-api-access-kzb9r\") pod \"node-resolver-jxm75\" (UID: \"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\") " pod="openshift-dns/node-resolver-jxm75" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.809538 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.823079 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.834703 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.838261 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.849447 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.851683 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jxm75" Oct 06 08:22:43 crc kubenswrapper[4755]: W1006 08:22:43.864971 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff8aa79_3b9f_472a_9a36_0e92cbf9e6f1.slice/crio-18816f8dfd31e7e8de431a5cd4d8f2d5917208226dba95a6971b43f6312b0dbf WatchSource:0}: Error finding container 18816f8dfd31e7e8de431a5cd4d8f2d5917208226dba95a6971b43f6312b0dbf: Status 404 returned error can't find the container with id 18816f8dfd31e7e8de431a5cd4d8f2d5917208226dba95a6971b43f6312b0dbf Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.883891 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.884702 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.885810 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.886684 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.887380 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.887864 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.888849 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.889357 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.890318 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.890842 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.892712 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.893520 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.896736 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.897273 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.897840 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.898750 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.899341 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.903642 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.904243 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.904540 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.904891 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.905877 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.906457 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.906950 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 06 08:22:43 crc 
kubenswrapper[4755]: I1006 08:22:43.908112 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.908754 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.910976 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.911673 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.913772 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.914395 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.915251 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.915713 4755 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.915825 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.918829 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.918859 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.919506 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.919981 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.921484 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.922554 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.923111 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.924110 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.924863 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.925710 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.926273 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.927220 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.927795 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.928705 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.929267 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.930195 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.930894 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.931790 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.932259 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.933105 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.933632 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.933751 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.934285 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.935080 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.935529 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-r96nx"] Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.935861 4755 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-multus/multus-r96nx" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.936936 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xsg89"] Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.937690 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.942044 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.942346 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.948791 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-rfqsq"] Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.949454 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.949971 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.950623 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.950762 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.951668 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.952125 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.952468 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.954783 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.956027 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.956679 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 
08:22:43.957037 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 06 08:22:43 crc kubenswrapper[4755]: I1006 08:22:43.961156 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.000389 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.021825 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jxm75" event={"ID":"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1","Type":"ContainerStarted","Data":"18816f8dfd31e7e8de431a5cd4d8f2d5917208226dba95a6971b43f6312b0dbf"} Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.031846 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.049053 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.053684 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6" exitCode=255 Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.053759 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6"} Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.054318 4755 scope.go:117] "RemoveContainer" containerID="b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.062289 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec"} Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.062343 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d"} Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.062354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"304ba58a76778a2ce7f89e0f40edd6965cdd12affc318379ea2a90b3aa2732a3"} Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.067196 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a7546411e6ac012798db01910ad135d16628b74994126b4e8317f28439bf59f5"} Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.068993 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910"} Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.069033 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1ed483b2bb9f7be02e3d2e68b52c6cdada0e49644b25ef8cdb37f3b17b06e895"} Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.085218 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.088706 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/891dff9a-4752-4022-83fc-51f626c76991-multus-daemon-config\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.088761 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/854f4c9e-3c8a-47bb-9427-bb5bfc5691d7-rootfs\") pod \"machine-config-daemon-rfqsq\" (UID: \"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\") " pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.088782 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/891dff9a-4752-4022-83fc-51f626c76991-cni-binary-copy\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.088802 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b19d445e-b55b-46be-ab4f-ad2d72a966b7-cni-binary-copy\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.088826 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-multus-cni-dir\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.088859 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-run-netns\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.088895 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-hostroot\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.088913 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcggh\" (UniqueName: \"kubernetes.io/projected/891dff9a-4752-4022-83fc-51f626c76991-kube-api-access-mcggh\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.088944 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b19d445e-b55b-46be-ab4f-ad2d72a966b7-system-cni-dir\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.088962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-cnibin\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089032 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-var-lib-kubelet\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089097 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prjlh\" (UniqueName: 
\"kubernetes.io/projected/854f4c9e-3c8a-47bb-9427-bb5bfc5691d7-kube-api-access-prjlh\") pod \"machine-config-daemon-rfqsq\" (UID: \"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\") " pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089119 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-multus-socket-dir-parent\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089136 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-run-k8s-cni-cncf-io\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089162 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/854f4c9e-3c8a-47bb-9427-bb5bfc5691d7-proxy-tls\") pod \"machine-config-daemon-rfqsq\" (UID: \"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\") " pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089184 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/854f4c9e-3c8a-47bb-9427-bb5bfc5691d7-mcd-auth-proxy-config\") pod \"machine-config-daemon-rfqsq\" (UID: \"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\") " pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089208 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-os-release\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089244 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-multus-conf-dir\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089306 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-var-lib-cni-multus\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089327 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b19d445e-b55b-46be-ab4f-ad2d72a966b7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089350 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b19d445e-b55b-46be-ab4f-ad2d72a966b7-os-release\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 
08:22:44.089386 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-system-cni-dir\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089422 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b19d445e-b55b-46be-ab4f-ad2d72a966b7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089460 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-etc-kubernetes\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089480 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b19d445e-b55b-46be-ab4f-ad2d72a966b7-cnibin\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089497 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-var-lib-cni-bin\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc 
kubenswrapper[4755]: I1006 08:22:44.089515 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-run-multus-certs\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.089588 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt4kk\" (UniqueName: \"kubernetes.io/projected/b19d445e-b55b-46be-ab4f-ad2d72a966b7-kube-api-access-bt4kk\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.114060 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.132313 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stat
ic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"im
ageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.148743 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.184371 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.190766 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-run-multus-certs\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.190811 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-etc-kubernetes\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.190831 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b19d445e-b55b-46be-ab4f-ad2d72a966b7-cnibin\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.190859 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-var-lib-cni-bin\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.190878 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt4kk\" (UniqueName: \"kubernetes.io/projected/b19d445e-b55b-46be-ab4f-ad2d72a966b7-kube-api-access-bt4kk\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.190895 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/891dff9a-4752-4022-83fc-51f626c76991-cni-binary-copy\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.190910 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/891dff9a-4752-4022-83fc-51f626c76991-multus-daemon-config\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.190938 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/854f4c9e-3c8a-47bb-9427-bb5bfc5691d7-rootfs\") pod \"machine-config-daemon-rfqsq\" (UID: \"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\") " pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.190955 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b19d445e-b55b-46be-ab4f-ad2d72a966b7-cni-binary-copy\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.190972 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-multus-cni-dir\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191011 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-run-netns\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191028 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-hostroot\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191043 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcggh\" (UniqueName: \"kubernetes.io/projected/891dff9a-4752-4022-83fc-51f626c76991-kube-api-access-mcggh\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191065 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-cnibin\") pod \"multus-r96nx\" (UID: 
\"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191079 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b19d445e-b55b-46be-ab4f-ad2d72a966b7-system-cni-dir\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191122 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-var-lib-kubelet\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191143 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prjlh\" (UniqueName: \"kubernetes.io/projected/854f4c9e-3c8a-47bb-9427-bb5bfc5691d7-kube-api-access-prjlh\") pod \"machine-config-daemon-rfqsq\" (UID: \"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\") " pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191158 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/854f4c9e-3c8a-47bb-9427-bb5bfc5691d7-proxy-tls\") pod \"machine-config-daemon-rfqsq\" (UID: \"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\") " pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191173 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-multus-socket-dir-parent\") pod \"multus-r96nx\" (UID: 
\"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191186 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-run-k8s-cni-cncf-io\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191206 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/854f4c9e-3c8a-47bb-9427-bb5bfc5691d7-mcd-auth-proxy-config\") pod \"machine-config-daemon-rfqsq\" (UID: \"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\") " pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191220 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-os-release\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191242 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-multus-conf-dir\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191261 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b19d445e-b55b-46be-ab4f-ad2d72a966b7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " 
pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191277 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-var-lib-cni-multus\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191302 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b19d445e-b55b-46be-ab4f-ad2d72a966b7-os-release\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191318 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-system-cni-dir\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191333 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b19d445e-b55b-46be-ab4f-ad2d72a966b7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191684 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-run-multus-certs\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" 
Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191751 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-etc-kubernetes\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191785 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b19d445e-b55b-46be-ab4f-ad2d72a966b7-cnibin\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.191814 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-var-lib-cni-bin\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.192043 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-var-lib-kubelet\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.192675 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-multus-socket-dir-parent\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.192729 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/891dff9a-4752-4022-83fc-51f626c76991-cni-binary-copy\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.192739 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-run-netns\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.192771 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-hostroot\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.193122 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-run-k8s-cni-cncf-io\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.193220 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-host-var-lib-cni-multus\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.193243 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/891dff9a-4752-4022-83fc-51f626c76991-multus-daemon-config\") pod \"multus-r96nx\" (UID: 
\"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.193285 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/854f4c9e-3c8a-47bb-9427-bb5bfc5691d7-rootfs\") pod \"machine-config-daemon-rfqsq\" (UID: \"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\") " pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.193363 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-cnibin\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.193403 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b19d445e-b55b-46be-ab4f-ad2d72a966b7-system-cni-dir\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.193769 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/854f4c9e-3c8a-47bb-9427-bb5bfc5691d7-mcd-auth-proxy-config\") pod \"machine-config-daemon-rfqsq\" (UID: \"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\") " pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.193804 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-multus-cni-dir\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 
crc kubenswrapper[4755]: I1006 08:22:44.193820 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b19d445e-b55b-46be-ab4f-ad2d72a966b7-cni-binary-copy\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.193838 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-multus-conf-dir\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.193913 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-os-release\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.193964 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/891dff9a-4752-4022-83fc-51f626c76991-system-cni-dir\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.194022 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b19d445e-b55b-46be-ab4f-ad2d72a966b7-os-release\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.194327 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b19d445e-b55b-46be-ab4f-ad2d72a966b7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.194545 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b19d445e-b55b-46be-ab4f-ad2d72a966b7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.198979 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/854f4c9e-3c8a-47bb-9427-bb5bfc5691d7-proxy-tls\") pod \"machine-config-daemon-rfqsq\" (UID: \"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\") " pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.209134 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.214924 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prjlh\" (UniqueName: \"kubernetes.io/projected/854f4c9e-3c8a-47bb-9427-bb5bfc5691d7-kube-api-access-prjlh\") pod \"machine-config-daemon-rfqsq\" (UID: \"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\") " pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.215012 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt4kk\" (UniqueName: \"kubernetes.io/projected/b19d445e-b55b-46be-ab4f-ad2d72a966b7-kube-api-access-bt4kk\") pod \"multus-additional-cni-plugins-xsg89\" (UID: \"b19d445e-b55b-46be-ab4f-ad2d72a966b7\") " pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.221625 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcggh\" (UniqueName: \"kubernetes.io/projected/891dff9a-4752-4022-83fc-51f626c76991-kube-api-access-mcggh\") pod \"multus-r96nx\" (UID: \"891dff9a-4752-4022-83fc-51f626c76991\") " pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.233038 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.255220 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.269965 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-r96nx" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.272583 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: W1006 08:22:44.285132 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod891dff9a_4752_4022_83fc_51f626c76991.slice/crio-6458043afc2ec39411873159bf46dab7afb43a6182c9ac6132f972212d387693 WatchSource:0}: Error finding container 6458043afc2ec39411873159bf46dab7afb43a6182c9ac6132f972212d387693: Status 404 returned error can't find the container with id 6458043afc2ec39411873159bf46dab7afb43a6182c9ac6132f972212d387693 Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.291088 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce98
4815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506c
e0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.305891 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.324758 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.337354 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xsg89" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.342528 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.348936 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.369145 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r8qq9"] Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.370109 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.373155 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.373420 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.373798 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.373990 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.374116 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.374228 4755 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.374427 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.377284 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.393468 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.398447 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.415750 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.431410 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.451655 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.473236 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.489895 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.493833 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-openvswitch\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.493861 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-ovn\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.493889 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-cni-netd\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.493907 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-kubelet\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.493920 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-env-overrides\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.493934 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-run-netns\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.493954 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-systemd\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.493970 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-var-lib-openvswitch\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.493987 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.494006 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-etc-openvswitch\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.494024 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-node-log\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.494044 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-systemd-units\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" 
Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.494061 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-slash\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.494076 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w22sj\" (UniqueName: \"kubernetes.io/projected/b0b431db-f56c-43e6-9f53-fbc28b857422-kube-api-access-w22sj\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.494096 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-ovnkube-script-lib\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.494109 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-cni-bin\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.494135 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-run-ovn-kubernetes\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.494154 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-log-socket\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.494170 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-ovnkube-config\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.494184 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0b431db-f56c-43e6-9f53-fbc28b857422-ovn-node-metrics-cert\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.505213 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.520767 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.541133 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] 
\\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582577
1aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.558669 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.580923 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.596602 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.596704 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-etc-openvswitch\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.596732 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-node-log\") pod \"ovnkube-node-r8qq9\" (UID: 
\"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.596801 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-node-log\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.596795 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:22:46.5967516 +0000 UTC m=+23.426066814 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.596847 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.596897 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-etc-openvswitch\") pod 
\"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597018 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597185 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-systemd-units\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597252 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-systemd-units\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597255 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-slash\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597311 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w22sj\" (UniqueName: \"kubernetes.io/projected/b0b431db-f56c-43e6-9f53-fbc28b857422-kube-api-access-w22sj\") pod \"ovnkube-node-r8qq9\" (UID: 
\"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597346 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597312 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-slash\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597370 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-ovnkube-script-lib\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597398 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-cni-bin\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597429 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597456 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-run-ovn-kubernetes\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597478 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-log-socket\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597511 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.597548 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.597601 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.597620 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.597673 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:46.597663852 +0000 UTC m=+23.426979066 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597760 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-run-ovn-kubernetes\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597810 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-cni-bin\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.597895 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered 
Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.597922 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.597935 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.598014 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:46.597966069 +0000 UTC m=+23.427281283 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.597532 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-ovnkube-config\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598054 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0b431db-f56c-43e6-9f53-fbc28b857422-ovn-node-metrics-cert\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598098 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-log-socket\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598111 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-ovn\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598144 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-openvswitch\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.598170 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.598201 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:46.598193905 +0000 UTC m=+23.427509119 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.598207 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598231 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-ovn\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.598239 4755 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:46.598231977 +0000 UTC m=+23.427547181 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598266 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-openvswitch\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598172 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598310 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-cni-netd\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598331 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-kubelet\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598355 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-run-netns\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598371 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-env-overrides\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598393 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-systemd\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598415 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-var-lib-openvswitch\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598492 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-var-lib-openvswitch\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598522 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-cni-netd\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598544 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-kubelet\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598582 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-run-netns\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.598639 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-systemd\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.599211 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-ovnkube-config\") pod \"ovnkube-node-r8qq9\" (UID: 
\"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.599211 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-env-overrides\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.599254 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-ovnkube-script-lib\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.604209 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0b431db-f56c-43e6-9f53-fbc28b857422-ovn-node-metrics-cert\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.616986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w22sj\" (UniqueName: \"kubernetes.io/projected/b0b431db-f56c-43e6-9f53-fbc28b857422-kube-api-access-w22sj\") pod \"ovnkube-node-r8qq9\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.626323 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.662842 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.688218 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.697967 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.878481 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.878535 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.878630 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:22:44 crc kubenswrapper[4755]: I1006 08:22:44.878735 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.878950 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:22:44 crc kubenswrapper[4755]: E1006 08:22:44.878767 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.074468 4755 generic.go:334] "Generic (PLEG): container finished" podID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerID="c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7" exitCode=0 Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.074534 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7"} Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.074578 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerStarted","Data":"402b7fc3b000089f7775a166774f0b7b9c7478c425671def97e65a93a6d825c5"} Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.076626 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r96nx" event={"ID":"891dff9a-4752-4022-83fc-51f626c76991","Type":"ContainerStarted","Data":"316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa"} Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.076652 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r96nx" 
event={"ID":"891dff9a-4752-4022-83fc-51f626c76991","Type":"ContainerStarted","Data":"6458043afc2ec39411873159bf46dab7afb43a6182c9ac6132f972212d387693"} Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.079423 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952"} Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.079481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2"} Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.079502 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"3811da992fe514b391284294cf22ade31a05c1359eb75f1f1e96684de72f4cf5"} Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.081164 4755 generic.go:334] "Generic (PLEG): container finished" podID="b19d445e-b55b-46be-ab4f-ad2d72a966b7" containerID="2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e" exitCode=0 Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.081238 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" event={"ID":"b19d445e-b55b-46be-ab4f-ad2d72a966b7","Type":"ContainerDied","Data":"2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e"} Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.081288 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" 
event={"ID":"b19d445e-b55b-46be-ab4f-ad2d72a966b7","Type":"ContainerStarted","Data":"1e2eee6e35c52a720c0a893a520859bf10175de48a97cb65cd90177ce5282f9c"} Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.083411 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.086388 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb"} Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.086847 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.088406 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jxm75" event={"ID":"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1","Type":"ContainerStarted","Data":"f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64"} Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.091941 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.102665 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.120176 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.148582 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.180137 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.208592 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.256660 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.328335 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.352805 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.421854 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.441621 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.468986 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.484942 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008
f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.502881 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.522589 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.556190 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.574815 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.600607 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.616095 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.632403 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.648200 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.663472 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.679387 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.702248 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.716246 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.731725 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:45 crc kubenswrapper[4755]: I1006 08:22:45.759516 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.095459 4755 generic.go:334] "Generic (PLEG): container finished" podID="b19d445e-b55b-46be-ab4f-ad2d72a966b7" containerID="82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f" exitCode=0 Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.095557 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" event={"ID":"b19d445e-b55b-46be-ab4f-ad2d72a966b7","Type":"ContainerDied","Data":"82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f"} Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.099557 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4"} Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.108140 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerStarted","Data":"a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c"} Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.108181 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerStarted","Data":"e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101"} Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.108196 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerStarted","Data":"8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068"} Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.108206 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerStarted","Data":"63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7"} Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.108217 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerStarted","Data":"53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f"} Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.108229 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerStarted","Data":"5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f"} Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.116183 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.134212 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.148021 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.158260 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.174183 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.188150 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a723
4c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.202812 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.218473 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.239044 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.254003 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.268539 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.294485 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.315602 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.332679 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.346843 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.364101 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.376429 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.385620 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.397718 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.410007 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.426005 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.441852 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.458629 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.472252 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.489523 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.505736 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.665845 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.666055 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:22:50.666023818 +0000 UTC m=+27.495339032 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.666396 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.666524 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.666613 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-06 08:22:50.666602952 +0000 UTC m=+27.495918346 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.666545 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.666790 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.666829 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.666846 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.666798 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.666883 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:50.666872009 +0000 UTC m=+27.496187403 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.667050 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.667199 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.667267 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.667338 4755 projected.go:194] Error preparing data 
for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.667552 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:50.667532464 +0000 UTC m=+27.496847678 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.667361 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.667740 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:50.66772866 +0000 UTC m=+27.497043874 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.877985 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.878094 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.878179 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.878267 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:22:46 crc kubenswrapper[4755]: I1006 08:22:46.878470 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:46 crc kubenswrapper[4755]: E1006 08:22:46.878726 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.114488 4755 generic.go:334] "Generic (PLEG): container finished" podID="b19d445e-b55b-46be-ab4f-ad2d72a966b7" containerID="b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37" exitCode=0 Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.114556 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" event={"ID":"b19d445e-b55b-46be-ab4f-ad2d72a966b7","Type":"ContainerDied","Data":"b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37"} Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.129730 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.148639 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.165238 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.182222 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.196856 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.211497 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.237166 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.258659 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c
465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.277693 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.296195 4755 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",
\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.315922 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.332748 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.350972 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.735051 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 06 08:22:47 crc 
kubenswrapper[4755]: I1006 08:22:47.749516 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.751503 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.752330 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.766256 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.780627 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.794025 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.809818 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.822367 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d
66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.837519 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-mh26r"] Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.837994 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mh26r" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.840694 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.840914 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.841032 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.841996 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.856619 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.875329 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.881180 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4aab0aad-4968-4984-92fe-b4920f08da9f-serviceca\") pod \"node-ca-mh26r\" (UID: \"4aab0aad-4968-4984-92fe-b4920f08da9f\") " pod="openshift-image-registry/node-ca-mh26r" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.881231 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fq5j\" (UniqueName: \"kubernetes.io/projected/4aab0aad-4968-4984-92fe-b4920f08da9f-kube-api-access-7fq5j\") pod \"node-ca-mh26r\" (UID: \"4aab0aad-4968-4984-92fe-b4920f08da9f\") " pod="openshift-image-registry/node-ca-mh26r" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.881258 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4aab0aad-4968-4984-92fe-b4920f08da9f-host\") pod \"node-ca-mh26r\" (UID: 
\"4aab0aad-4968-4984-92fe-b4920f08da9f\") " pod="openshift-image-registry/node-ca-mh26r" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.892842 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.908693 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.922619 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.938794 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.966258 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.980424 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.982715 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4aab0aad-4968-4984-92fe-b4920f08da9f-serviceca\") pod \"node-ca-mh26r\" (UID: \"4aab0aad-4968-4984-92fe-b4920f08da9f\") " pod="openshift-image-registry/node-ca-mh26r" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.981733 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/4aab0aad-4968-4984-92fe-b4920f08da9f-serviceca\") pod \"node-ca-mh26r\" (UID: \"4aab0aad-4968-4984-92fe-b4920f08da9f\") " pod="openshift-image-registry/node-ca-mh26r" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.982813 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fq5j\" (UniqueName: \"kubernetes.io/projected/4aab0aad-4968-4984-92fe-b4920f08da9f-kube-api-access-7fq5j\") pod \"node-ca-mh26r\" (UID: \"4aab0aad-4968-4984-92fe-b4920f08da9f\") " pod="openshift-image-registry/node-ca-mh26r" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.982838 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4aab0aad-4968-4984-92fe-b4920f08da9f-host\") pod \"node-ca-mh26r\" (UID: \"4aab0aad-4968-4984-92fe-b4920f08da9f\") " pod="openshift-image-registry/node-ca-mh26r" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.982978 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4aab0aad-4968-4984-92fe-b4920f08da9f-host\") pod \"node-ca-mh26r\" (UID: \"4aab0aad-4968-4984-92fe-b4920f08da9f\") " pod="openshift-image-registry/node-ca-mh26r" Oct 06 08:22:47 crc kubenswrapper[4755]: I1006 08:22:47.994129 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.004157 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fq5j\" (UniqueName: 
\"kubernetes.io/projected/4aab0aad-4968-4984-92fe-b4920f08da9f-kube-api-access-7fq5j\") pod \"node-ca-mh26r\" (UID: \"4aab0aad-4968-4984-92fe-b4920f08da9f\") " pod="openshift-image-registry/node-ca-mh26r" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.013967 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.027172 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.049767 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.063166 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.074711 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.089196 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.102838 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.116602 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.131636 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerStarted","Data":"9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97"} Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.134217 4755 generic.go:334] "Generic (PLEG): container finished" podID="b19d445e-b55b-46be-ab4f-ad2d72a966b7" 
containerID="f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910" exitCode=0 Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.134672 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" event={"ID":"b19d445e-b55b-46be-ab4f-ad2d72a966b7","Type":"ContainerDied","Data":"f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910"} Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.142399 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.153179 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mh26r" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.166857 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: W1006 08:22:48.169535 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aab0aad_4968_4984_92fe_b4920f08da9f.slice/crio-7b9f0443d26e8f968773823300fea3be6821b766f6f0d29db9522a658baf4754 WatchSource:0}: Error finding container 7b9f0443d26e8f968773823300fea3be6821b766f6f0d29db9522a658baf4754: Status 404 returned error can't find the container with id 7b9f0443d26e8f968773823300fea3be6821b766f6f0d29db9522a658baf4754 Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.190951 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.207277 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.226228 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c
465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.238785 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.254221 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.269216 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.286087 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.300195 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.313283 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.325501 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.342616 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.372827 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.395468 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.438466 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.459661 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.536903 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.554794 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.570755 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.878354 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.878433 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:48 crc kubenswrapper[4755]: E1006 08:22:48.878519 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:22:48 crc kubenswrapper[4755]: I1006 08:22:48.878451 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:48 crc kubenswrapper[4755]: E1006 08:22:48.878607 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:22:48 crc kubenswrapper[4755]: E1006 08:22:48.878712 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.124138 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.126674 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.126717 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.126729 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.126895 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.136524 4755 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.136951 4755 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.142178 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.142238 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.142248 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.142262 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.142272 4755 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:49Z","lastTransitionTime":"2025-10-06T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.145969 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mh26r" event={"ID":"4aab0aad-4968-4984-92fe-b4920f08da9f","Type":"ContainerStarted","Data":"45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6"} Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.146021 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mh26r" event={"ID":"4aab0aad-4968-4984-92fe-b4920f08da9f","Type":"ContainerStarted","Data":"7b9f0443d26e8f968773823300fea3be6821b766f6f0d29db9522a658baf4754"} Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.153003 4755 generic.go:334] "Generic (PLEG): container finished" podID="b19d445e-b55b-46be-ab4f-ad2d72a966b7" containerID="dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd" exitCode=0 Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.153076 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" event={"ID":"b19d445e-b55b-46be-ab4f-ad2d72a966b7","Type":"ContainerDied","Data":"dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd"} Oct 06 08:22:49 crc kubenswrapper[4755]: E1006 08:22:49.162886 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.166231 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014
ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.168754 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.169020 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.169038 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc 
kubenswrapper[4755]: I1006 08:22:49.169062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.169074 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:49Z","lastTransitionTime":"2025-10-06T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:49 crc kubenswrapper[4755]: E1006 08:22:49.183766 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.186538 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ba
a38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.188687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.188882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.188996 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.189137 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.189252 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:49Z","lastTransitionTime":"2025-10-06T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.199869 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: E1006 08:22:49.202312 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.205486 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.205512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.205521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.205538 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.205550 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:49Z","lastTransitionTime":"2025-10-06T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.215263 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: E1006 08:22:49.218831 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.223132 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.223182 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.223197 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.223213 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.223225 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:49Z","lastTransitionTime":"2025-10-06T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.229092 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: E1006 08:22:49.240204 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: E1006 08:22:49.240366 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.243728 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.243782 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.243795 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.243815 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.243827 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:49Z","lastTransitionTime":"2025-10-06T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.244143 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.264967 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.278088 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.291940 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.306262 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.326478 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.343102 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.346777 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.346835 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.346855 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.346945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.346989 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:49Z","lastTransitionTime":"2025-10-06T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.356702 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.368253 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.380927 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.406701 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.423453 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.442153 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.450139 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 
08:22:49.450188 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.450200 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.450220 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.450234 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:49Z","lastTransitionTime":"2025-10-06T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.458408 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.474870 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.493864 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.509874 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.523411 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.553708 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.553778 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.553798 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.553824 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.553842 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:49Z","lastTransitionTime":"2025-10-06T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.557929 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.592398 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.636687 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.656878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.657230 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.657341 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.657430 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.657513 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:49Z","lastTransitionTime":"2025-10-06T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.678020 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.715357 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.759443 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.760575 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.760596 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.760604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.760617 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.760626 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:49Z","lastTransitionTime":"2025-10-06T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.798809 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.864314 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.864374 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.864391 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.864413 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.864430 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:49Z","lastTransitionTime":"2025-10-06T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.967916 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.967988 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.968003 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.968025 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:49 crc kubenswrapper[4755]: I1006 08:22:49.968039 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:49Z","lastTransitionTime":"2025-10-06T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.070697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.070744 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.070757 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.070777 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.070790 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:50Z","lastTransitionTime":"2025-10-06T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.166376 4755 generic.go:334] "Generic (PLEG): container finished" podID="b19d445e-b55b-46be-ab4f-ad2d72a966b7" containerID="45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734" exitCode=0 Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.166473 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" event={"ID":"b19d445e-b55b-46be-ab4f-ad2d72a966b7","Type":"ContainerDied","Data":"45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734"} Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.173365 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.173422 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.173446 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.173470 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.173488 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:50Z","lastTransitionTime":"2025-10-06T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.189989 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z 
is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.223000 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.239042 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.258741 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.277545 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.277606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.277618 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.277639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.277653 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:50Z","lastTransitionTime":"2025-10-06T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.288338 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.306725 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.320835 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.337745 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.352244 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.367701 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.380806 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.380918 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.380929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.380947 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.380958 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:50Z","lastTransitionTime":"2025-10-06T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.383980 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.405786 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.419060 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.437514 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c
465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.451227 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.483726 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.483753 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.483764 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:50 crc 
kubenswrapper[4755]: I1006 08:22:50.483782 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.483793 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:50Z","lastTransitionTime":"2025-10-06T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.586604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.586665 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.586678 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.587010 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.587074 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:50Z","lastTransitionTime":"2025-10-06T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.690608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.690669 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.690687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.690708 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.690726 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:50Z","lastTransitionTime":"2025-10-06T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.757324 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.757705 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.757843 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:22:58.757793262 +0000 UTC m=+35.587108516 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.757936 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.758025 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:58.757999127 +0000 UTC m=+35.587314371 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.758169 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.758286 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-06 08:22:58.758240943 +0000 UTC m=+35.587556237 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.758006 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.758455 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.758529 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.758803 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 
08:22:50.758837 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.758862 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.758880 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.758891 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.758908 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.758980 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:58.758957491 +0000 UTC m=+35.588272775 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.759041 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:58.758999461 +0000 UTC m=+35.588314715 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.794544 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.794618 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.794630 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.794711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.794731 4755 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:50Z","lastTransitionTime":"2025-10-06T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.878258 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.878346 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.878444 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.878549 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.878863 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:50 crc kubenswrapper[4755]: E1006 08:22:50.878967 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.898109 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.898342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.898401 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.898534 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:50 crc kubenswrapper[4755]: I1006 08:22:50.898619 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:50Z","lastTransitionTime":"2025-10-06T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.002820 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.002876 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.002886 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.002909 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.002926 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:51Z","lastTransitionTime":"2025-10-06T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.106851 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.106915 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.106933 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.106961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.106982 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:51Z","lastTransitionTime":"2025-10-06T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.177543 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" event={"ID":"b19d445e-b55b-46be-ab4f-ad2d72a966b7","Type":"ContainerStarted","Data":"45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069"} Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.187195 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerStarted","Data":"9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc"} Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.187815 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.210238 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.210302 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.210321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.210347 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.210366 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:51Z","lastTransitionTime":"2025-10-06T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.213164 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.229397 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.235361 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.260875 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.282858 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.298187 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.313942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.313974 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.313982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.314000 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.314012 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:51Z","lastTransitionTime":"2025-10-06T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.317983 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.336261 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.353252 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.370239 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.386764 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834
285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.398419 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: 
I1006 08:22:51.414455 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.419061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.419098 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.419113 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.419134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.419149 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:51Z","lastTransitionTime":"2025-10-06T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.427468 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.458726 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.475436 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.513807 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.522633 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.522703 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.522725 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.522752 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.522772 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:51Z","lastTransitionTime":"2025-10-06T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.533290 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.554103 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.577823 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.601286 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.617360 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.625882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.625928 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.625940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:51 crc 
kubenswrapper[4755]: I1006 08:22:51.625988 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.626003 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:51Z","lastTransitionTime":"2025-10-06T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.645410 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.666699 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.682637 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.697742 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.716428 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834
285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.729554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.729606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.729614 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.729634 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.729648 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:51Z","lastTransitionTime":"2025-10-06T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.732395 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.750644 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.777830 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.793294 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.832920 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.832992 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.833015 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.833044 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 
08:22:51.833068 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:51Z","lastTransitionTime":"2025-10-06T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.935837 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.935868 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.935876 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.935889 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:51 crc kubenswrapper[4755]: I1006 08:22:51.935898 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:51Z","lastTransitionTime":"2025-10-06T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.039292 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.039335 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.039350 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.039369 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.039385 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:52Z","lastTransitionTime":"2025-10-06T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.143591 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.143633 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.143644 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.143669 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.143687 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:52Z","lastTransitionTime":"2025-10-06T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.192397 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.192474 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.235506 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.245926 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.245952 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.245960 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.245975 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.245984 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:52Z","lastTransitionTime":"2025-10-06T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.251625 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.265993 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.283224 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.298805 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.313152 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.335815 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8
040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.352053 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.352107 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.352120 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.352142 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.352153 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:52Z","lastTransitionTime":"2025-10-06T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.355971 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\"
,\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.372687 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.388919 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.409103 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.430217 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.451100 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.455711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:52 crc 
kubenswrapper[4755]: I1006 08:22:52.455747 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.455757 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.455775 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.455788 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:52Z","lastTransitionTime":"2025-10-06T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.475259 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.497237 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87c
b3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.521779 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:52Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.559131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.559175 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.559187 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.559205 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.559220 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:52Z","lastTransitionTime":"2025-10-06T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.661629 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.661874 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.661940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.662017 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.662077 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:52Z","lastTransitionTime":"2025-10-06T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.765169 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.765709 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.765789 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.765854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.765928 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:52Z","lastTransitionTime":"2025-10-06T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.868675 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.868713 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.868723 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.868738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.868751 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:52Z","lastTransitionTime":"2025-10-06T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.878504 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.878697 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.878508 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:52 crc kubenswrapper[4755]: E1006 08:22:52.879002 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:22:52 crc kubenswrapper[4755]: E1006 08:22:52.879143 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:22:52 crc kubenswrapper[4755]: E1006 08:22:52.879267 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.971910 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.972202 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.972287 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.972372 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:52 crc kubenswrapper[4755]: I1006 08:22:52.972444 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:52Z","lastTransitionTime":"2025-10-06T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.074824 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.074848 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.074856 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.074869 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.074877 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:53Z","lastTransitionTime":"2025-10-06T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.177250 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.177289 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.177300 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.177318 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.177330 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:53Z","lastTransitionTime":"2025-10-06T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.195998 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/0.log" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.198857 4755 generic.go:334] "Generic (PLEG): container finished" podID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerID="9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc" exitCode=1 Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.198897 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc"} Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.199645 4755 scope.go:117] "RemoveContainer" containerID="9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.217172 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.243007 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:53Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 08:22:53.156519 6019 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:22:53.156536 6019 handler.go:190] Sending *v1.Node event handler 7 
for removal\\\\nI1006 08:22:53.156601 6019 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:22:53.156651 6019 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:22:53.156662 6019 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:22:53.156654 6019 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:22:53.156688 6019 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 08:22:53.156701 6019 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:22:53.156705 6019 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:22:53.156718 6019 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 08:22:53.156716 6019 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:22:53.156703 6019 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:22:53.156775 6019 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:22:53.156788 6019 factory.go:656] Stopping watch factory\\\\nI1006 08:22:53.156819 6019 ovnkube.go:599] Stopped ovnkube\\\\nI1006 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.258409 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.272628 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.279828 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.279871 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.279884 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.279903 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.279915 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:53Z","lastTransitionTime":"2025-10-06T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.294031 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.306958 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.323545 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.348424 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.364329 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.383430 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.383502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:53 crc 
kubenswrapper[4755]: I1006 08:22:53.383526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.383556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.383666 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:53Z","lastTransitionTime":"2025-10-06T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.387401 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.402911 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.422000 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.433292 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.449437 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834
285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.463825 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: 
I1006 08:22:53.485626 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.485666 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.485676 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.485692 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.485705 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:53Z","lastTransitionTime":"2025-10-06T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.587751 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.587796 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.587807 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.587828 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.587842 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:53Z","lastTransitionTime":"2025-10-06T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.690945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.691009 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.691023 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.691046 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.691065 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:53Z","lastTransitionTime":"2025-10-06T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.793255 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.793293 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.793305 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.793321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.793333 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:53Z","lastTransitionTime":"2025-10-06T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.892367 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.895426 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.895464 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.895477 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.895494 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.895507 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:53Z","lastTransitionTime":"2025-10-06T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.906077 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.919145 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.932991 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.946259 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.963086 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.987428 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.997244 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.997292 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.997304 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.997322 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:53 crc kubenswrapper[4755]: I1006 08:22:53.997335 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:53Z","lastTransitionTime":"2025-10-06T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.002509 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.019522 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.033487 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.047089 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.068813 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:53Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 08:22:53.156519 6019 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:22:53.156536 6019 handler.go:190] Sending *v1.Node event handler 7 
for removal\\\\nI1006 08:22:53.156601 6019 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:22:53.156651 6019 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:22:53.156662 6019 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:22:53.156654 6019 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:22:53.156688 6019 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 08:22:53.156701 6019 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:22:53.156705 6019 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:22:53.156718 6019 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 08:22:53.156716 6019 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:22:53.156703 6019 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:22:53.156775 6019 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:22:53.156788 6019 factory.go:656] Stopping watch factory\\\\nI1006 08:22:53.156819 6019 ovnkube.go:599] Stopped ovnkube\\\\nI1006 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.083758 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.100454 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.100526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.100547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.100609 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.100632 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:54Z","lastTransitionTime":"2025-10-06T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.111297 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.132639 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.202826 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.202876 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.202884 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.202925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.202934 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:54Z","lastTransitionTime":"2025-10-06T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.204520 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/0.log" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.206847 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerStarted","Data":"92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4"} Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.207266 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.228811 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.241697 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.254830 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.268600 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.280959 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.300134 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.306094 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.306125 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.306134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.306147 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.306157 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:54Z","lastTransitionTime":"2025-10-06T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.315596 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.336051 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.352821 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.370766 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.390131 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:53Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 08:22:53.156519 6019 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:22:53.156536 6019 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:22:53.156601 6019 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 
08:22:53.156651 6019 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:22:53.156662 6019 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:22:53.156654 6019 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:22:53.156688 6019 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 08:22:53.156701 6019 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:22:53.156705 6019 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:22:53.156718 6019 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 08:22:53.156716 6019 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:22:53.156703 6019 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:22:53.156775 6019 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:22:53.156788 6019 factory.go:656] Stopping watch factory\\\\nI1006 08:22:53.156819 6019 ovnkube.go:599] Stopped ovnkube\\\\nI1006 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.404906 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.408553 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.408622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.408633 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.408648 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.408657 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:54Z","lastTransitionTime":"2025-10-06T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.422424 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.438055 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.456623 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.511662 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.511717 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.511730 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.511749 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.511764 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:54Z","lastTransitionTime":"2025-10-06T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.614409 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.614459 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.614475 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.614497 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.614514 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:54Z","lastTransitionTime":"2025-10-06T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.720541 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.720660 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.720687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.720716 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.720741 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:54Z","lastTransitionTime":"2025-10-06T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.823419 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.823468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.823477 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.823490 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.823499 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:54Z","lastTransitionTime":"2025-10-06T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.878679 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.878735 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.878796 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:54 crc kubenswrapper[4755]: E1006 08:22:54.878886 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:22:54 crc kubenswrapper[4755]: E1006 08:22:54.879212 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:22:54 crc kubenswrapper[4755]: E1006 08:22:54.879261 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.926820 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.926918 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.926966 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.926997 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:54 crc kubenswrapper[4755]: I1006 08:22:54.927018 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:54Z","lastTransitionTime":"2025-10-06T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.029906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.030081 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.030105 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.030171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.030192 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:55Z","lastTransitionTime":"2025-10-06T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.134058 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.134126 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.134143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.134168 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.134186 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:55Z","lastTransitionTime":"2025-10-06T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.212210 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/1.log" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.213760 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/0.log" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.216430 4755 generic.go:334] "Generic (PLEG): container finished" podID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerID="92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4" exitCode=1 Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.216479 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4"} Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.216550 4755 scope.go:117] "RemoveContainer" containerID="9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.217814 4755 scope.go:117] "RemoveContainer" containerID="92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4" Oct 06 08:22:55 crc kubenswrapper[4755]: E1006 08:22:55.218164 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.237074 4755 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.237666 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.237797 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.237922 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.238034 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:55Z","lastTransitionTime":"2025-10-06T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.242465 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.259419 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.281897 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.301291 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.318772 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.339581 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8
040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.342170 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.342210 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.342251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.342269 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.342281 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:55Z","lastTransitionTime":"2025-10-06T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.359602 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\"
,\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.381279 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.401960 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.424721 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.442260 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.444942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.444978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.444990 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.445006 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.445017 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:55Z","lastTransitionTime":"2025-10-06T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.463345 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.498014 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:53Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 08:22:53.156519 6019 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:22:53.156536 6019 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:22:53.156601 6019 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 
08:22:53.156651 6019 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:22:53.156662 6019 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:22:53.156654 6019 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:22:53.156688 6019 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 08:22:53.156701 6019 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:22:53.156705 6019 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:22:53.156718 6019 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 08:22:53.156716 6019 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:22:53.156703 6019 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:22:53.156775 6019 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:22:53.156788 6019 factory.go:656] Stopping watch factory\\\\nI1006 08:22:53.156819 6019 ovnkube.go:599] Stopped ovnkube\\\\nI1006 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:54Z\\\",\\\"message\\\":\\\"cer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd/etcd]} name:Service_openshift-etcd/etcd_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.253:2379: 10.217.5.253:9979:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {de17f0de-cfb1-4534-bb42-c40f5e050c73}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:22:54.184241 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.518219 4755 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.547929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.548015 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.548067 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.548103 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.548127 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:55Z","lastTransitionTime":"2025-10-06T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.549915 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.637351 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.651870 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.651938 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.651956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.651980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.651998 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:55Z","lastTransitionTime":"2025-10-06T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.671994 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9155712188d9788ff0c6e1cb26c8142eecb377f07cd66714651a97317be9a6fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:53Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 08:22:53.156519 6019 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:22:53.156536 6019 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:22:53.156601 6019 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 
08:22:53.156651 6019 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:22:53.156662 6019 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:22:53.156654 6019 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:22:53.156688 6019 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 08:22:53.156701 6019 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:22:53.156705 6019 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:22:53.156718 6019 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 08:22:53.156716 6019 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:22:53.156703 6019 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:22:53.156775 6019 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:22:53.156788 6019 factory.go:656] Stopping watch factory\\\\nI1006 08:22:53.156819 6019 ovnkube.go:599] Stopped ovnkube\\\\nI1006 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:54Z\\\",\\\"message\\\":\\\"cer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd/etcd]} name:Service_openshift-etcd/etcd_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.253:2379: 10.217.5.253:9979:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {de17f0de-cfb1-4534-bb42-c40f5e050c73}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:22:54.184241 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.687696 4755 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.706376 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.727408 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.755494 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:55 crc 
kubenswrapper[4755]: I1006 08:22:55.755598 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.755625 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.755653 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.755670 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:55Z","lastTransitionTime":"2025-10-06T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.755972 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.779528 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.795160 4755 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.806441 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.822749 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.833995 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.848359 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.857365 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.857410 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.857425 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.857446 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.857462 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:55Z","lastTransitionTime":"2025-10-06T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.860311 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.880762 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.892927 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.910187 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.960702 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.960757 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.960781 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.960810 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:55 crc kubenswrapper[4755]: I1006 08:22:55.960833 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:55Z","lastTransitionTime":"2025-10-06T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.064209 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.064251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.064262 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.064278 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.064288 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:56Z","lastTransitionTime":"2025-10-06T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.167386 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.167453 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.167523 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.167542 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.167554 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:56Z","lastTransitionTime":"2025-10-06T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.222622 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/1.log" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.227983 4755 scope.go:117] "RemoveContainer" containerID="92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4" Oct 06 08:22:56 crc kubenswrapper[4755]: E1006 08:22:56.228249 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.250693 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.270298 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:56 crc 
kubenswrapper[4755]: I1006 08:22:56.270372 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.270390 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.270415 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.270432 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:56Z","lastTransitionTime":"2025-10-06T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.286101 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:54Z\\\",\\\"message\\\":\\\"cer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-etcd/etcd]} name:Service_openshift-etcd/etcd_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.253:2379: 10.217.5.253:9979:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {de17f0de-cfb1-4534-bb42-c40f5e050c73}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:22:54.184241 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.301725 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.321237 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.353368 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.366981 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.373320 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.373383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.373400 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.373425 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.373448 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:56Z","lastTransitionTime":"2025-10-06T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.383852 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.399167 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.417678 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.440239 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.459906 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.476311 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.476359 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.476371 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.476389 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.476402 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:56Z","lastTransitionTime":"2025-10-06T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.480422 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.496357 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.513211 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834
285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.528072 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: 
I1006 08:22:56.541606 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn"] Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.542006 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.544276 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.544718 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.558302 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.572302 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.578598 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.578637 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.578648 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.578663 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.578675 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:56Z","lastTransitionTime":"2025-10-06T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.588017 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.600205 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.611464 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.619653 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dfe4c263-9750-4b65-b308-b998f3fa1eae-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6m7xn\" (UID: \"dfe4c263-9750-4b65-b308-b998f3fa1eae\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.619752 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrxvs\" (UniqueName: \"kubernetes.io/projected/dfe4c263-9750-4b65-b308-b998f3fa1eae-kube-api-access-qrxvs\") pod \"ovnkube-control-plane-749d76644c-6m7xn\" (UID: \"dfe4c263-9750-4b65-b308-b998f3fa1eae\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.619963 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dfe4c263-9750-4b65-b308-b998f3fa1eae-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6m7xn\" (UID: \"dfe4c263-9750-4b65-b308-b998f3fa1eae\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.620062 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dfe4c263-9750-4b65-b308-b998f3fa1eae-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6m7xn\" (UID: \"dfe4c263-9750-4b65-b308-b998f3fa1eae\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.628801 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.648244 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63
6952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.667134 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.681359 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.681394 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.681403 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.681418 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.681428 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:56Z","lastTransitionTime":"2025-10-06T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.684637 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.695899 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.713094 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834
285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.721315 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dfe4c263-9750-4b65-b308-b998f3fa1eae-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6m7xn\" (UID: \"dfe4c263-9750-4b65-b308-b998f3fa1eae\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.721369 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dfe4c263-9750-4b65-b308-b998f3fa1eae-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6m7xn\" (UID: \"dfe4c263-9750-4b65-b308-b998f3fa1eae\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.721435 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dfe4c263-9750-4b65-b308-b998f3fa1eae-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6m7xn\" (UID: \"dfe4c263-9750-4b65-b308-b998f3fa1eae\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.721484 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrxvs\" (UniqueName: \"kubernetes.io/projected/dfe4c263-9750-4b65-b308-b998f3fa1eae-kube-api-access-qrxvs\") pod \"ovnkube-control-plane-749d76644c-6m7xn\" (UID: \"dfe4c263-9750-4b65-b308-b998f3fa1eae\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.722350 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dfe4c263-9750-4b65-b308-b998f3fa1eae-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6m7xn\" (UID: \"dfe4c263-9750-4b65-b308-b998f3fa1eae\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.722950 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dfe4c263-9750-4b65-b308-b998f3fa1eae-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6m7xn\" (UID: \"dfe4c263-9750-4b65-b308-b998f3fa1eae\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.740732 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dfe4c263-9750-4b65-b308-b998f3fa1eae-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6m7xn\" (UID: \"dfe4c263-9750-4b65-b308-b998f3fa1eae\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.741531 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.759679 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrxvs\" (UniqueName: \"kubernetes.io/projected/dfe4c263-9750-4b65-b308-b998f3fa1eae-kube-api-access-qrxvs\") pod \"ovnkube-control-plane-749d76644c-6m7xn\" (UID: \"dfe4c263-9750-4b65-b308-b998f3fa1eae\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.774137 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.784175 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:56 crc 
kubenswrapper[4755]: I1006 08:22:56.784215 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.784226 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.784243 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.784256 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:56Z","lastTransitionTime":"2025-10-06T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.806939 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:54Z\\\",\\\"message\\\":\\\"cer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-etcd/etcd]} name:Service_openshift-etcd/etcd_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.253:2379: 10.217.5.253:9979:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {de17f0de-cfb1-4534-bb42-c40f5e050c73}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:22:54.184241 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.819433 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.846528 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08
:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.856737 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" Oct 06 08:22:56 crc kubenswrapper[4755]: W1006 08:22:56.869665 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfe4c263_9750_4b65_b308_b998f3fa1eae.slice/crio-6ab5ae3ead1e348aac0a8dca917d68591e0e3a991b4b65ddf783678216e4b69c WatchSource:0}: Error finding container 6ab5ae3ead1e348aac0a8dca917d68591e0e3a991b4b65ddf783678216e4b69c: Status 404 returned error can't find the container with id 6ab5ae3ead1e348aac0a8dca917d68591e0e3a991b4b65ddf783678216e4b69c Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.878389 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:56 crc kubenswrapper[4755]: E1006 08:22:56.878511 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.878866 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:56 crc kubenswrapper[4755]: E1006 08:22:56.878953 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.879000 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:56 crc kubenswrapper[4755]: E1006 08:22:56.879050 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.886484 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.886519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.886529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.886546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.886559 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:56Z","lastTransitionTime":"2025-10-06T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.988962 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.989012 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.989022 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.989039 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:56 crc kubenswrapper[4755]: I1006 08:22:56.989050 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:56Z","lastTransitionTime":"2025-10-06T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.092053 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.092102 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.092115 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.092132 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.092144 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:57Z","lastTransitionTime":"2025-10-06T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.194391 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.194532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.194544 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.194625 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.194643 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:57Z","lastTransitionTime":"2025-10-06T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.237055 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" event={"ID":"dfe4c263-9750-4b65-b308-b998f3fa1eae","Type":"ContainerStarted","Data":"f79909fb6aa4c21171a7e5ca4677bfd840bf25180e3310df04661a162a0a567d"} Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.242719 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" event={"ID":"dfe4c263-9750-4b65-b308-b998f3fa1eae","Type":"ContainerStarted","Data":"0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079"} Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.242783 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" event={"ID":"dfe4c263-9750-4b65-b308-b998f3fa1eae","Type":"ContainerStarted","Data":"6ab5ae3ead1e348aac0a8dca917d68591e0e3a991b4b65ddf783678216e4b69c"} Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.264641 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.292907 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.298044 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:57 crc 
kubenswrapper[4755]: I1006 08:22:57.298099 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.298112 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.298198 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.298212 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:57Z","lastTransitionTime":"2025-10-06T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.331010 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:54Z\\\",\\\"message\\\":\\\"cer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-etcd/etcd]} name:Service_openshift-etcd/etcd_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.253:2379: 10.217.5.253:9979:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {de17f0de-cfb1-4534-bb42-c40f5e050c73}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:22:54.184241 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.347306 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.370039 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08
:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.387600 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.401445 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.401497 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.401509 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.401528 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.401541 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:57Z","lastTransitionTime":"2025-10-06T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.404954 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.422944 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},
{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.439661 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.456799 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.476785 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8
040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.495888 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.504452 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.504517 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.504535 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:57 crc 
kubenswrapper[4755]: I1006 08:22:57.504556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.504599 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:57Z","lastTransitionTime":"2025-10-06T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.518658 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf25180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06
T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.537103 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\
\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T0
8:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.555445 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.573675 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.608678 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.608733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.608744 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.608763 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.608778 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:57Z","lastTransitionTime":"2025-10-06T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.712852 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.712904 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.712914 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.712930 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.712941 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:57Z","lastTransitionTime":"2025-10-06T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.816525 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.816628 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.816645 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.816673 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.816712 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:57Z","lastTransitionTime":"2025-10-06T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.920375 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.920415 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.920424 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.920442 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:57 crc kubenswrapper[4755]: I1006 08:22:57.920454 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:57Z","lastTransitionTime":"2025-10-06T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.024476 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.024545 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.024597 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.024632 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.024656 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:58Z","lastTransitionTime":"2025-10-06T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.127647 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.127697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.127708 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.127748 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.127760 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:58Z","lastTransitionTime":"2025-10-06T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.230746 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.230826 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.230848 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.230886 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.230909 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:58Z","lastTransitionTime":"2025-10-06T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.333867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.334445 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.334549 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.334666 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.334756 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:58Z","lastTransitionTime":"2025-10-06T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.438237 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.438303 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.438320 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.438345 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.438364 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:58Z","lastTransitionTime":"2025-10-06T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.444978 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vf9ht"] Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.445695 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.445786 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.465850 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.482812 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.501277 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.514617 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.532985 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834
285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.540383 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm9nn\" (UniqueName: \"kubernetes.io/projected/60fbd235-a60f-436e-9552-e3eaf60f24f3-kube-api-access-bm9nn\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.540543 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.543214 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.543298 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.543323 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.543357 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.543389 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:58Z","lastTransitionTime":"2025-10-06T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.555595 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.568694 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf25180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.584759 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10
b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap 
from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.598356 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.615230 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.631017 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.642381 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.642443 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm9nn\" (UniqueName: \"kubernetes.io/projected/60fbd235-a60f-436e-9552-e3eaf60f24f3-kube-api-access-bm9nn\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.642772 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.642973 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs 
podName:60fbd235-a60f-436e-9552-e3eaf60f24f3 nodeName:}" failed. No retries permitted until 2025-10-06 08:22:59.142931696 +0000 UTC m=+35.972246950 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs") pod "network-metrics-daemon-vf9ht" (UID: "60fbd235-a60f-436e-9552-e3eaf60f24f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.646117 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.646292 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.646355 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.646417 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.646503 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:58Z","lastTransitionTime":"2025-10-06T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.651707 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.669098 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm9nn\" (UniqueName: \"kubernetes.io/projected/60fbd235-a60f-436e-9552-e3eaf60f24f3-kube-api-access-bm9nn\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.675512 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.722859 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:54Z\\\",\\\"message\\\":\\\"cer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd/etcd]} name:Service_openshift-etcd/etcd_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.253:2379: 10.217.5.253:9979:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {de17f0de-cfb1-4534-bb42-c40f5e050c73}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:22:54.184241 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.740065 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.749377 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.749430 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.749439 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.749468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.749477 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:58Z","lastTransitionTime":"2025-10-06T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.755220 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc 
kubenswrapper[4755]: I1006 08:22:58.784799 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.844503 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.844722 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.844818 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.844853 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:23:14.844806033 +0000 UTC m=+51.674121247 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.844904 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:23:14.844892085 +0000 UTC m=+51.674207299 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.844940 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.844984 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.845014 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.845117 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.845155 4755 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.845167 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.845172 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.845196 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.845234 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.845256 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.845209 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-06 08:23:14.845202222 +0000 UTC m=+51.674517436 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.845313 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:23:14.845306025 +0000 UTC m=+51.674621239 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.845327 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:23:14.845321325 +0000 UTC m=+51.674636539 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.852862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.852924 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.852941 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.852965 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.852981 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:58Z","lastTransitionTime":"2025-10-06T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.877729 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.877754 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.877901 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.878009 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.878102 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:22:58 crc kubenswrapper[4755]: E1006 08:22:58.878260 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.955933 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.955994 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.956009 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.956029 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:58 crc kubenswrapper[4755]: I1006 08:22:58.956041 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:58Z","lastTransitionTime":"2025-10-06T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.060373 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.060528 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.060616 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.060649 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.060669 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.149950 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:22:59 crc kubenswrapper[4755]: E1006 08:22:59.150162 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:22:59 crc kubenswrapper[4755]: E1006 08:22:59.150292 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs podName:60fbd235-a60f-436e-9552-e3eaf60f24f3 nodeName:}" failed. No retries permitted until 2025-10-06 08:23:00.150263327 +0000 UTC m=+36.979578581 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs") pod "network-metrics-daemon-vf9ht" (UID: "60fbd235-a60f-436e-9552-e3eaf60f24f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.163304 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.163365 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.163379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.163399 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.163415 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.266390 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.266439 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.266451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.266469 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.266481 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.323959 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.324008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.324019 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.324038 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.324051 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: E1006 08:22:59.337134 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.341743 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.341849 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.341867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.341899 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.341912 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: E1006 08:22:59.355183 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.359158 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.359204 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.359215 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.359235 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.359247 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: E1006 08:22:59.376683 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.381704 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.381748 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.381758 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.381777 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.381789 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: E1006 08:22:59.395514 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.399938 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.399991 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.400007 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.400030 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.400123 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: E1006 08:22:59.414305 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:22:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:22:59 crc kubenswrapper[4755]: E1006 08:22:59.414473 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.417503 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.417581 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.417594 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.417625 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.417639 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.520304 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.520374 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.520393 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.520419 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.520442 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.623939 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.623982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.623999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.624029 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.624044 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.727182 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.727231 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.727240 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.727257 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.727269 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.831134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.831187 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.831201 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.831223 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.831236 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.934681 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.934746 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.934761 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.934785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:22:59 crc kubenswrapper[4755]: I1006 08:22:59.934802 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:22:59Z","lastTransitionTime":"2025-10-06T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.038252 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.038321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.038335 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.038358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.038375 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:00Z","lastTransitionTime":"2025-10-06T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.141317 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.141441 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.141457 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.141475 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.141487 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:00Z","lastTransitionTime":"2025-10-06T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.162728 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:00 crc kubenswrapper[4755]: E1006 08:23:00.162980 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:23:00 crc kubenswrapper[4755]: E1006 08:23:00.163116 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs podName:60fbd235-a60f-436e-9552-e3eaf60f24f3 nodeName:}" failed. No retries permitted until 2025-10-06 08:23:02.163095422 +0000 UTC m=+38.992410636 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs") pod "network-metrics-daemon-vf9ht" (UID: "60fbd235-a60f-436e-9552-e3eaf60f24f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.244930 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.245002 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.245024 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.245049 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.245067 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:00Z","lastTransitionTime":"2025-10-06T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.349051 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.349116 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.349131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.349154 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.349171 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:00Z","lastTransitionTime":"2025-10-06T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.453395 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.453465 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.453479 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.453499 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.453512 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:00Z","lastTransitionTime":"2025-10-06T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.556334 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.556407 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.556425 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.556449 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.556471 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:00Z","lastTransitionTime":"2025-10-06T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.659742 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.659783 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.659795 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.659812 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.659824 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:00Z","lastTransitionTime":"2025-10-06T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.763662 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.763728 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.763744 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.763766 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.763787 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:00Z","lastTransitionTime":"2025-10-06T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.867127 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.867173 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.867195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.867217 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.867231 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:00Z","lastTransitionTime":"2025-10-06T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.878755 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.878805 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.878815 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.878755 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:00 crc kubenswrapper[4755]: E1006 08:23:00.878906 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:00 crc kubenswrapper[4755]: E1006 08:23:00.879023 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:00 crc kubenswrapper[4755]: E1006 08:23:00.879144 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:00 crc kubenswrapper[4755]: E1006 08:23:00.879241 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.971269 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.971809 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.972240 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.972437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:00 crc kubenswrapper[4755]: I1006 08:23:00.973180 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:00Z","lastTransitionTime":"2025-10-06T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.076171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.076238 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.076260 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.076288 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.076312 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:01Z","lastTransitionTime":"2025-10-06T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.179618 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.179670 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.179687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.179710 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.179727 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:01Z","lastTransitionTime":"2025-10-06T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.282889 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.282971 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.282996 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.283027 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.283050 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:01Z","lastTransitionTime":"2025-10-06T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.386623 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.386694 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.386716 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.386741 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.386762 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:01Z","lastTransitionTime":"2025-10-06T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.491994 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.492106 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.492136 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.492176 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.492204 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:01Z","lastTransitionTime":"2025-10-06T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.595341 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.595427 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.595448 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.595480 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.595501 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:01Z","lastTransitionTime":"2025-10-06T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.698870 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.698916 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.698926 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.698944 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.698957 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:01Z","lastTransitionTime":"2025-10-06T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.801827 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.801869 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.801880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.801899 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.801912 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:01Z","lastTransitionTime":"2025-10-06T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.905288 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.905340 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.905357 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.905375 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:01 crc kubenswrapper[4755]: I1006 08:23:01.905387 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:01Z","lastTransitionTime":"2025-10-06T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.010692 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.010756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.010774 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.010806 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.010824 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:02Z","lastTransitionTime":"2025-10-06T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.114330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.114377 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.114389 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.114411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.114425 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:02Z","lastTransitionTime":"2025-10-06T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.186321 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:02 crc kubenswrapper[4755]: E1006 08:23:02.186577 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:23:02 crc kubenswrapper[4755]: E1006 08:23:02.186679 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs podName:60fbd235-a60f-436e-9552-e3eaf60f24f3 nodeName:}" failed. No retries permitted until 2025-10-06 08:23:06.186651325 +0000 UTC m=+43.015966539 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs") pod "network-metrics-daemon-vf9ht" (UID: "60fbd235-a60f-436e-9552-e3eaf60f24f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.217005 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.217061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.217073 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.217095 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.217147 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:02Z","lastTransitionTime":"2025-10-06T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.320116 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.320162 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.320171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.320190 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.320203 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:02Z","lastTransitionTime":"2025-10-06T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.422752 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.422813 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.422830 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.422855 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.422872 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:02Z","lastTransitionTime":"2025-10-06T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.526499 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.526546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.526595 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.526619 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.526634 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:02Z","lastTransitionTime":"2025-10-06T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.630006 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.630051 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.630061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.630079 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.630089 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:02Z","lastTransitionTime":"2025-10-06T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.733547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.733687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.733722 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.733759 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.733784 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:02Z","lastTransitionTime":"2025-10-06T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.838196 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.838275 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.838293 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.838318 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.838338 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:02Z","lastTransitionTime":"2025-10-06T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.878710 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.878948 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:02 crc kubenswrapper[4755]: E1006 08:23:02.879133 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.879183 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.879224 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:02 crc kubenswrapper[4755]: E1006 08:23:02.879394 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:02 crc kubenswrapper[4755]: E1006 08:23:02.879516 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:02 crc kubenswrapper[4755]: E1006 08:23:02.879692 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.943330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.943394 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.943413 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.943431 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:02 crc kubenswrapper[4755]: I1006 08:23:02.943448 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:02Z","lastTransitionTime":"2025-10-06T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.047673 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.047737 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.047756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.047785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.047804 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:03Z","lastTransitionTime":"2025-10-06T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.150708 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.150801 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.150837 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.150872 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.150910 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:03Z","lastTransitionTime":"2025-10-06T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.254705 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.254769 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.254788 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.254819 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.254840 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:03Z","lastTransitionTime":"2025-10-06T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.359864 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.359986 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.360016 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.360048 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.360067 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:03Z","lastTransitionTime":"2025-10-06T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.464306 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.464409 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.464434 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.464466 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.464491 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:03Z","lastTransitionTime":"2025-10-06T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.568166 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.568222 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.568235 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.568254 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.568268 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:03Z","lastTransitionTime":"2025-10-06T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.671520 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.671630 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.671657 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.671693 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.671716 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:03Z","lastTransitionTime":"2025-10-06T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.774936 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.775013 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.775035 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.775060 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.775082 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:03Z","lastTransitionTime":"2025-10-06T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.877939 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.877999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.878017 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.878039 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.878056 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:03Z","lastTransitionTime":"2025-10-06T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.913966 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.936209 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.955161 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.977897 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.981542 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.981629 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.981639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.981659 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:03 crc kubenswrapper[4755]: I1006 08:23:03.981669 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:03Z","lastTransitionTime":"2025-10-06T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.004032 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.032747 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834
285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.050959 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:04 crc kubenswrapper[4755]: 
I1006 08:23:04.077929 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf25180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.085747 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.085806 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.085821 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.085852 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.085868 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:04Z","lastTransitionTime":"2025-10-06T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.103308 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.124156 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.149776 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.176042 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.189025 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.189073 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.189085 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.189109 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.189127 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:04Z","lastTransitionTime":"2025-10-06T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.200946 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.224093 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.249228 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:54Z\\\",\\\"message\\\":\\\"cer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd/etcd]} name:Service_openshift-etcd/etcd_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.253:2379: 10.217.5.253:9979:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {de17f0de-cfb1-4534-bb42-c40f5e050c73}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:22:54.184241 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.264345 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.287804 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:04 crc 
kubenswrapper[4755]: I1006 08:23:04.291625 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.291658 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.291667 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.291684 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.291695 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:04Z","lastTransitionTime":"2025-10-06T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.394401 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.394480 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.394504 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.394533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.394555 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:04Z","lastTransitionTime":"2025-10-06T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.497726 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.497788 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.497806 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.497829 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.497845 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:04Z","lastTransitionTime":"2025-10-06T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.601260 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.601332 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.601358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.601388 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.601410 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:04Z","lastTransitionTime":"2025-10-06T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.704589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.704957 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.705109 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.705228 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.705291 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:04Z","lastTransitionTime":"2025-10-06T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.808297 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.808354 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.808371 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.808397 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.808414 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:04Z","lastTransitionTime":"2025-10-06T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.878761 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.878907 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:04 crc kubenswrapper[4755]: E1006 08:23:04.878971 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.879017 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:04 crc kubenswrapper[4755]: E1006 08:23:04.879184 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:04 crc kubenswrapper[4755]: E1006 08:23:04.879318 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.879422 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:04 crc kubenswrapper[4755]: E1006 08:23:04.879671 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.912204 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.912312 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.912338 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.912369 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:04 crc kubenswrapper[4755]: I1006 08:23:04.912388 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:04Z","lastTransitionTime":"2025-10-06T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.015174 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.015256 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.015278 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.015310 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.015333 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:05Z","lastTransitionTime":"2025-10-06T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.117634 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.117672 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.117681 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.117694 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.117703 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:05Z","lastTransitionTime":"2025-10-06T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.220771 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.220835 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.220859 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.220887 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.220908 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:05Z","lastTransitionTime":"2025-10-06T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.324896 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.324978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.325002 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.325041 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.325068 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:05Z","lastTransitionTime":"2025-10-06T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.428077 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.428131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.428143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.428162 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.428174 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:05Z","lastTransitionTime":"2025-10-06T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.531005 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.531052 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.531063 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.531076 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.531086 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:05Z","lastTransitionTime":"2025-10-06T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.633769 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.633822 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.633833 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.633850 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.633866 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:05Z","lastTransitionTime":"2025-10-06T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.736765 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.736818 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.736831 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.736849 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.736862 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:05Z","lastTransitionTime":"2025-10-06T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.840018 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.840061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.840073 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.840092 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.840104 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:05Z","lastTransitionTime":"2025-10-06T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.942973 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.943017 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.943028 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.943043 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:05 crc kubenswrapper[4755]: I1006 08:23:05.943054 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:05Z","lastTransitionTime":"2025-10-06T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.045198 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.045270 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.045288 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.045312 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.045329 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:06Z","lastTransitionTime":"2025-10-06T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.147909 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.147981 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.148012 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.148042 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.148067 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:06Z","lastTransitionTime":"2025-10-06T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.231252 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:06 crc kubenswrapper[4755]: E1006 08:23:06.231442 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:23:06 crc kubenswrapper[4755]: E1006 08:23:06.231526 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs podName:60fbd235-a60f-436e-9552-e3eaf60f24f3 nodeName:}" failed. No retries permitted until 2025-10-06 08:23:14.231502726 +0000 UTC m=+51.060817980 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs") pod "network-metrics-daemon-vf9ht" (UID: "60fbd235-a60f-436e-9552-e3eaf60f24f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.252399 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.252463 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.252473 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.252494 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.252508 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:06Z","lastTransitionTime":"2025-10-06T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.356120 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.356187 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.356199 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.356223 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.356238 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:06Z","lastTransitionTime":"2025-10-06T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.459012 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.459069 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.459083 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.459101 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.459115 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:06Z","lastTransitionTime":"2025-10-06T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.563161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.563233 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.563257 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.563286 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.563310 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:06Z","lastTransitionTime":"2025-10-06T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.666983 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.667062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.667083 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.667111 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.667132 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:06Z","lastTransitionTime":"2025-10-06T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.771422 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.771501 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.771525 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.771557 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.771665 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:06Z","lastTransitionTime":"2025-10-06T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.876191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.876269 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.876293 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.876320 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.876340 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:06Z","lastTransitionTime":"2025-10-06T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.878638 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.878665 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.878656 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.878634 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:06 crc kubenswrapper[4755]: E1006 08:23:06.878818 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:06 crc kubenswrapper[4755]: E1006 08:23:06.878911 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:06 crc kubenswrapper[4755]: E1006 08:23:06.879156 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:06 crc kubenswrapper[4755]: E1006 08:23:06.879459 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.980578 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.980620 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.980631 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.980646 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:06 crc kubenswrapper[4755]: I1006 08:23:06.980659 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:06Z","lastTransitionTime":"2025-10-06T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.083948 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.084208 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.084291 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.084378 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.084467 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:07Z","lastTransitionTime":"2025-10-06T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.187524 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.187937 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.188091 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.188235 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.188403 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:07Z","lastTransitionTime":"2025-10-06T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.292607 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.293134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.293330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.293520 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.293777 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:07Z","lastTransitionTime":"2025-10-06T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.397764 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.397823 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.397842 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.397871 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.397889 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:07Z","lastTransitionTime":"2025-10-06T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.500958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.501018 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.501036 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.501063 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.501083 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:07Z","lastTransitionTime":"2025-10-06T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.604299 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.604342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.604351 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.604373 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.604387 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:07Z","lastTransitionTime":"2025-10-06T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.707198 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.707269 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.707291 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.707316 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.707333 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:07Z","lastTransitionTime":"2025-10-06T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.809950 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.810010 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.810029 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.810053 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.810070 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:07Z","lastTransitionTime":"2025-10-06T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.913746 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.913809 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.913832 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.913861 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:07 crc kubenswrapper[4755]: I1006 08:23:07.913883 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:07Z","lastTransitionTime":"2025-10-06T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.017227 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.017291 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.017309 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.017334 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.017351 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:08Z","lastTransitionTime":"2025-10-06T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.121089 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.121146 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.121158 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.121174 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.121489 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:08Z","lastTransitionTime":"2025-10-06T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.225484 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.225548 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.225599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.225630 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.225653 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:08Z","lastTransitionTime":"2025-10-06T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.329657 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.329905 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.329923 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.329949 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.329966 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:08Z","lastTransitionTime":"2025-10-06T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.434230 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.434281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.434293 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.434311 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.434329 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:08Z","lastTransitionTime":"2025-10-06T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.537022 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.537068 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.537079 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.537094 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.537104 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:08Z","lastTransitionTime":"2025-10-06T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.640284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.640371 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.640391 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.640416 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.640433 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:08Z","lastTransitionTime":"2025-10-06T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.743863 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.743951 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.743974 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.744005 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.744028 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:08Z","lastTransitionTime":"2025-10-06T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.848098 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.848157 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.848167 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.848189 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.848203 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:08Z","lastTransitionTime":"2025-10-06T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.878807 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:08 crc kubenswrapper[4755]: E1006 08:23:08.878998 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.879094 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.879155 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.879201 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:08 crc kubenswrapper[4755]: E1006 08:23:08.879319 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:08 crc kubenswrapper[4755]: E1006 08:23:08.879788 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:08 crc kubenswrapper[4755]: E1006 08:23:08.880032 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.951149 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.951214 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.951238 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.951294 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:08 crc kubenswrapper[4755]: I1006 08:23:08.951314 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:08Z","lastTransitionTime":"2025-10-06T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.054607 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.054676 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.054694 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.054715 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.054735 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.158467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.158519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.158554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.158599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.158613 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.262114 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.262199 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.262216 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.262244 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.262261 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.365540 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.365609 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.365621 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.365640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.365654 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.437530 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.437607 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.437622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.437642 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.437657 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: E1006 08:23:09.454935 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.460033 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.460099 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.460120 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.460148 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.460173 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: E1006 08:23:09.478510 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.483311 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.483383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.483402 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.483430 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.483448 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: E1006 08:23:09.505099 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.510431 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.510494 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.510512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.510536 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.510554 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: E1006 08:23:09.531010 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.536033 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.536101 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.536125 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.536155 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.536182 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: E1006 08:23:09.554170 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:09 crc kubenswrapper[4755]: E1006 08:23:09.554431 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.556348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.556403 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.556422 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.556449 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.556468 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.658982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.659020 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.659034 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.659052 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.659066 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.762849 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.762913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.762933 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.762963 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.762985 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.866120 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.866185 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.866203 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.866229 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.866249 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.969173 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.969213 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.969223 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.969237 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:09 crc kubenswrapper[4755]: I1006 08:23:09.969247 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:09Z","lastTransitionTime":"2025-10-06T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.072291 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.072363 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.072389 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.072416 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.072438 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:10Z","lastTransitionTime":"2025-10-06T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.175517 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.175665 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.175686 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.175719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.175740 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:10Z","lastTransitionTime":"2025-10-06T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.279373 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.279433 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.279448 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.279473 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.279490 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:10Z","lastTransitionTime":"2025-10-06T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.381931 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.381974 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.381984 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.381999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.382012 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:10Z","lastTransitionTime":"2025-10-06T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.485161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.485257 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.485278 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.485311 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.485333 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:10Z","lastTransitionTime":"2025-10-06T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.589591 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.589646 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.589659 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.589681 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.589694 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:10Z","lastTransitionTime":"2025-10-06T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.693746 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.693811 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.693828 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.693851 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.693864 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:10Z","lastTransitionTime":"2025-10-06T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.797950 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.798022 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.798034 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.798057 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.798070 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:10Z","lastTransitionTime":"2025-10-06T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.878703 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.878726 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.879126 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.879171 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:10 crc kubenswrapper[4755]: E1006 08:23:10.879539 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.879627 4755 scope.go:117] "RemoveContainer" containerID="92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4" Oct 06 08:23:10 crc kubenswrapper[4755]: E1006 08:23:10.879772 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:10 crc kubenswrapper[4755]: E1006 08:23:10.879927 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:10 crc kubenswrapper[4755]: E1006 08:23:10.880217 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.901178 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.901221 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.901231 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.901248 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:10 crc kubenswrapper[4755]: I1006 08:23:10.901259 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:10Z","lastTransitionTime":"2025-10-06T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.004974 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.005024 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.005037 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.005058 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.005075 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:11Z","lastTransitionTime":"2025-10-06T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.108956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.109278 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.109292 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.109312 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.109330 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:11Z","lastTransitionTime":"2025-10-06T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.243054 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.243129 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.243145 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.243182 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.243202 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:11Z","lastTransitionTime":"2025-10-06T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.305608 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/1.log" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.309552 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerStarted","Data":"e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47"} Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.310866 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.327015 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.346190 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.346553 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.346726 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:11 crc 
kubenswrapper[4755]: I1006 08:23:11.346901 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.347033 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:11Z","lastTransitionTime":"2025-10-06T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.350441 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.387059 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:54Z\\\",\\\"message\\\":\\\"cer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd/etcd]} name:Service_openshift-etcd/etcd_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.253:2379: 10.217.5.253:9979:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {de17f0de-cfb1-4534-bb42-c40f5e050c73}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:22:54.184241 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-06T08:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:23:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\"
,\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.402891 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.421469 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc 
kubenswrapper[4755]: I1006 08:23:11.449806 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.449867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.449880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.449906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.449922 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:11Z","lastTransitionTime":"2025-10-06T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.458964 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.476022 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.492156 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.510943 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.530829 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.550629 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834
285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.552678 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.552746 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.552763 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.552782 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.552794 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:11Z","lastTransitionTime":"2025-10-06T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.565654 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.578116 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf2
5180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.596227 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.611139 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.626687 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.640990 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.655661 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.655746 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.655760 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.655787 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.655802 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:11Z","lastTransitionTime":"2025-10-06T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.758884 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.758943 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.758957 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.758981 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.758996 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:11Z","lastTransitionTime":"2025-10-06T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.862997 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.863065 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.863085 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.863115 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.863136 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:11Z","lastTransitionTime":"2025-10-06T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.966141 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.966202 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.966220 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.966244 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:11 crc kubenswrapper[4755]: I1006 08:23:11.966263 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:11Z","lastTransitionTime":"2025-10-06T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.069937 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.070014 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.070031 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.070058 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.070074 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:12Z","lastTransitionTime":"2025-10-06T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.174206 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.174761 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.174868 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.174979 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.175083 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:12Z","lastTransitionTime":"2025-10-06T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.279442 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.279491 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.279505 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.279528 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.279543 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:12Z","lastTransitionTime":"2025-10-06T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.314516 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/2.log" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.315617 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/1.log" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.320136 4755 generic.go:334] "Generic (PLEG): container finished" podID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerID="e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47" exitCode=1 Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.320181 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47"} Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.320230 4755 scope.go:117] "RemoveContainer" containerID="92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.322700 4755 scope.go:117] "RemoveContainer" containerID="e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47" Oct 06 08:23:12 crc kubenswrapper[4755]: E1006 08:23:12.323223 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.343224 4755 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.364896 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.378685 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.383616 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.383756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.383847 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.383942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.384044 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:12Z","lastTransitionTime":"2025-10-06T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.398642 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.413035 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.426657 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf25180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc 
kubenswrapper[4755]: I1006 08:23:12.444035 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393
d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.473676 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92dd02555669593699264536ea13632e725595c2195951ca03965e95682ee8b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:22:54Z\\\",\\\"message\\\":\\\"cer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd/etcd]} name:Service_openshift-etcd/etcd_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.253:2379: 10.217.5.253:9979:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {de17f0de-cfb1-4534-bb42-c40f5e050c73}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:22:54.184241 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:11Z\\\",\\\"message\\\":\\\"60} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:23:11.935977 6379 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935981 6379 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-mh26r\\\\nI1006 08:23:11.935987 6379 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935986 6379 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1006 08:23:11.935994 6379 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1006 08:23:11.936001 6379 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1006 08:23:11.936007 6379 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":
\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overri
des\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 
08:23:12.486639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.486746 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.486768 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.486828 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.486848 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:12Z","lastTransitionTime":"2025-10-06T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.489558 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.506630 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.523348 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.543459 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.572442 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.589803 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.589850 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.589859 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.589876 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.589887 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:12Z","lastTransitionTime":"2025-10-06T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.592137 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.610093 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.627069 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.641305 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.692480 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.692538 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.692549 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.692593 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.692609 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:12Z","lastTransitionTime":"2025-10-06T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.795756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.795824 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.795848 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.795873 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.795893 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:12Z","lastTransitionTime":"2025-10-06T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.878472 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.878486 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:12 crc kubenswrapper[4755]: E1006 08:23:12.878700 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.878495 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.878497 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:12 crc kubenswrapper[4755]: E1006 08:23:12.878869 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:12 crc kubenswrapper[4755]: E1006 08:23:12.878988 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:12 crc kubenswrapper[4755]: E1006 08:23:12.879047 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.898525 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.898603 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.898617 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.898641 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:12 crc kubenswrapper[4755]: I1006 08:23:12.898659 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:12Z","lastTransitionTime":"2025-10-06T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.002148 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.002199 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.002209 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.002228 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.002240 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:13Z","lastTransitionTime":"2025-10-06T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.105621 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.105688 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.105700 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.105723 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.105742 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:13Z","lastTransitionTime":"2025-10-06T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.208544 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.208712 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.208732 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.208767 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.208785 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:13Z","lastTransitionTime":"2025-10-06T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.311793 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.311921 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.311988 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.312019 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.312346 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:13Z","lastTransitionTime":"2025-10-06T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.326262 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/2.log" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.330194 4755 scope.go:117] "RemoveContainer" containerID="e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47" Oct 06 08:23:13 crc kubenswrapper[4755]: E1006 08:23:13.330587 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.352884 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.372637 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.387165 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.401335 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.420316 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.420405 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.420426 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.420456 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.420478 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:13Z","lastTransitionTime":"2025-10-06T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.424257 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.440428 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.458353 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf25180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc 
kubenswrapper[4755]: I1006 08:23:13.477304 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.494476 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.523943 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:13 crc 
kubenswrapper[4755]: I1006 08:23:13.524001 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.524015 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.524037 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.524054 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:13Z","lastTransitionTime":"2025-10-06T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.524756 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:11Z\\\",\\\"message\\\":\\\"60} was not added to shared informer because it has stopped already, failed to start node 
network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:23:11.935977 6379 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935981 6379 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-mh26r\\\\nI1006 08:23:11.935987 6379 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935986 6379 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1006 08:23:11.935994 6379 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1006 08:23:11.936001 6379 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1006 08:23:11.936007 6379 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.538246 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.554653 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc 
kubenswrapper[4755]: I1006 08:23:13.555449 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.568316 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.581488 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.598612 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.611368 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.625985 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.627146 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.627188 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.627198 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.627214 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.627224 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:13Z","lastTransitionTime":"2025-10-06T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.640701 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.657759 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.671189 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.686975 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.701509 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.722086 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.730189 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.730478 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.730635 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.730791 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.730940 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:13Z","lastTransitionTime":"2025-10-06T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.742427 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.761289 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.777182 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.796100 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e
6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.811846 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.828790 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf2
5180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.833492 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.833624 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.833644 4755 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.833706 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.833727 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:13Z","lastTransitionTime":"2025-10-06T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.845666 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.863194 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.888899 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:11Z\\\",\\\"message\\\":\\\"60} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:23:11.935977 6379 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935981 6379 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-mh26r\\\\nI1006 08:23:11.935987 6379 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935986 6379 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1006 08:23:11.935994 6379 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1006 08:23:11.936001 6379 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1006 08:23:11.936007 6379 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.905703 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.920699 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc 
kubenswrapper[4755]: I1006 08:23:13.936426 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.936469 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.936501 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.936517 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.936529 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:13Z","lastTransitionTime":"2025-10-06T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.942833 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.958104 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba582c30-5753-4c4d-99d9-ad31cd59ec1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93874dc90338ebd50d41428b77b4e2dd449e76144dd24496e5a600b34d0493c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9ef9720e2410a56e4c7545511fb13d9bd68254cf0072d9dc6afb84de237a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7baf461ac5121358231a5700611f38875e26386b1fe59a2b49ae3b2d976fe083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:13 crc kubenswrapper[4755]: I1006 08:23:13.980078 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755
a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.011210 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:11Z\\\",\\\"message\\\":\\\"60} was not added to shared informer because it has stopped already, failed to start node 
network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:23:11.935977 6379 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935981 6379 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-mh26r\\\\nI1006 08:23:11.935987 6379 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935986 6379 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1006 08:23:11.935994 6379 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1006 08:23:11.936001 6379 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1006 08:23:11.936007 6379 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.027691 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.040818 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.040872 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.040887 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.040913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.040929 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:14Z","lastTransitionTime":"2025-10-06T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.044446 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc 
kubenswrapper[4755]: I1006 08:23:14.060320 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.072388 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba582c30-5753-4c4d-99d9-ad31cd59ec1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93874dc90338ebd50d41428b77b4e2dd449e76144dd24496e5a600b34d0493c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9ef9720e2410a56e4c7545511fb13d9bd68254cf0072d9dc6afb84de237a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7baf461ac5121358231a5700611f38875e26386b1fe59a2b49ae3b2d976fe083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.092471 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.108061 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.127648 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.144717 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.144773 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.144786 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.144807 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.144820 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:14Z","lastTransitionTime":"2025-10-06T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.145149 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.166670 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.193179 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.211871 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.234008 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.237311 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.237540 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.237684 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs podName:60fbd235-a60f-436e-9552-e3eaf60f24f3 nodeName:}" failed. No retries permitted until 2025-10-06 08:23:30.237655059 +0000 UTC m=+67.066970273 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs") pod "network-metrics-daemon-vf9ht" (UID: "60fbd235-a60f-436e-9552-e3eaf60f24f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.248028 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.248072 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.248085 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.248105 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.248117 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:14Z","lastTransitionTime":"2025-10-06T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.251134 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.270520 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.288182 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.303255 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf2
5180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.351037 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.351184 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.351213 4755 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.351246 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.351272 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:14Z","lastTransitionTime":"2025-10-06T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.453968 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.454074 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.454092 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.454118 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.454140 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:14Z","lastTransitionTime":"2025-10-06T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.557404 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.557469 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.557486 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.557511 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.557529 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:14Z","lastTransitionTime":"2025-10-06T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.661245 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.661285 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.661294 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.661308 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.661317 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:14Z","lastTransitionTime":"2025-10-06T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.764483 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.764554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.764591 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.764610 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.764622 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:14Z","lastTransitionTime":"2025-10-06T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.849456 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.849706 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 08:23:46.84967044 +0000 UTC m=+83.678985664 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.851028 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.851287 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.851537 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.851848 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.851292 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.852262 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.852607 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.851374 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.851647 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.851921 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.852863 4755 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:23:46.852838547 +0000 UTC m=+83.682153831 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.852950 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.853290 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.853461 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:23:46.853253777 +0000 UTC m=+83.682569021 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.853667 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:23:46.853646446 +0000 UTC m=+83.682961700 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.853844 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:23:46.85382684 +0000 UTC m=+83.683142084 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.867396 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.867672 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.867999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.868103 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.868180 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:14Z","lastTransitionTime":"2025-10-06T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.877975 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.878007 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.877974 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.877977 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.878089 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.878450 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.878442 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:14 crc kubenswrapper[4755]: E1006 08:23:14.878523 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.971712 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.972025 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.972114 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.972179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:14 crc kubenswrapper[4755]: I1006 08:23:14.972247 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:14Z","lastTransitionTime":"2025-10-06T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.075437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.075886 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.076012 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.076152 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.076274 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:15Z","lastTransitionTime":"2025-10-06T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.178865 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.178925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.178945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.178974 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.178998 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:15Z","lastTransitionTime":"2025-10-06T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.283262 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.283320 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.283501 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.283526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.283539 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:15Z","lastTransitionTime":"2025-10-06T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.386766 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.386860 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.386885 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.386923 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.386953 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:15Z","lastTransitionTime":"2025-10-06T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.491015 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.491843 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.492027 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.492183 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.492314 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:15Z","lastTransitionTime":"2025-10-06T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.596156 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.596209 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.596225 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.596248 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.596265 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:15Z","lastTransitionTime":"2025-10-06T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.699877 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.700347 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.700482 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.700640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.700770 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:15Z","lastTransitionTime":"2025-10-06T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.803462 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.803523 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.803536 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.803575 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.803589 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:15Z","lastTransitionTime":"2025-10-06T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.906321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.906389 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.906418 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.906445 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:15 crc kubenswrapper[4755]: I1006 08:23:15.906464 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:15Z","lastTransitionTime":"2025-10-06T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.010881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.011340 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.011413 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.011486 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.011595 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:16Z","lastTransitionTime":"2025-10-06T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.115393 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.115806 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.115933 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.116020 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.116089 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:16Z","lastTransitionTime":"2025-10-06T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.219781 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.219860 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.219873 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.219907 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.219922 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:16Z","lastTransitionTime":"2025-10-06T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.322961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.323029 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.323048 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.323074 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.323096 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:16Z","lastTransitionTime":"2025-10-06T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.426614 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.427062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.427147 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.427219 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.427292 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:16Z","lastTransitionTime":"2025-10-06T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.531205 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.531273 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.531283 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.531304 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.531317 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:16Z","lastTransitionTime":"2025-10-06T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.634533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.634635 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.634655 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.634684 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.634703 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:16Z","lastTransitionTime":"2025-10-06T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.738162 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.738235 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.738249 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.738276 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.738294 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:16Z","lastTransitionTime":"2025-10-06T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.841002 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.841092 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.841106 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.841130 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.841148 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:16Z","lastTransitionTime":"2025-10-06T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.878608 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.878697 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.878715 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.878865 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:16 crc kubenswrapper[4755]: E1006 08:23:16.878858 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:16 crc kubenswrapper[4755]: E1006 08:23:16.878931 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:16 crc kubenswrapper[4755]: E1006 08:23:16.879014 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:16 crc kubenswrapper[4755]: E1006 08:23:16.879087 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.954353 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.954432 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.954448 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.954471 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:16 crc kubenswrapper[4755]: I1006 08:23:16.954493 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:16Z","lastTransitionTime":"2025-10-06T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.057917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.057970 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.057987 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.058011 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.058030 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:17Z","lastTransitionTime":"2025-10-06T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.161694 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.161759 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.161774 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.161800 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.161815 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:17Z","lastTransitionTime":"2025-10-06T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.264670 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.264727 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.264742 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.264768 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.264787 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:17Z","lastTransitionTime":"2025-10-06T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.367609 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.367704 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.367730 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.367765 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.367791 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:17Z","lastTransitionTime":"2025-10-06T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.471784 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.471854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.471878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.471913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.471937 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:17Z","lastTransitionTime":"2025-10-06T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.574960 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.575008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.575017 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.575037 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.575047 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:17Z","lastTransitionTime":"2025-10-06T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.678152 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.678188 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.678201 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.678221 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.678232 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:17Z","lastTransitionTime":"2025-10-06T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.782237 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.782297 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.782312 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.782331 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.782345 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:17Z","lastTransitionTime":"2025-10-06T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.884532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.884596 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.884610 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.884626 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.884641 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:17Z","lastTransitionTime":"2025-10-06T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.988246 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.988294 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.988305 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.988321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:17 crc kubenswrapper[4755]: I1006 08:23:17.988330 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:17Z","lastTransitionTime":"2025-10-06T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.091725 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.091805 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.091822 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.091852 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.091871 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:18Z","lastTransitionTime":"2025-10-06T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.195560 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.195676 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.195697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.195728 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.195747 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:18Z","lastTransitionTime":"2025-10-06T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.298447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.298514 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.298532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.298555 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.298603 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:18Z","lastTransitionTime":"2025-10-06T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.402546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.402649 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.402668 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.402699 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.402725 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:18Z","lastTransitionTime":"2025-10-06T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.506086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.506407 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.506532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.506706 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.506888 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:18Z","lastTransitionTime":"2025-10-06T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.610859 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.610918 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.610936 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.610961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.610980 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:18Z","lastTransitionTime":"2025-10-06T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.714270 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.714337 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.714354 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.714382 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.714409 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:18Z","lastTransitionTime":"2025-10-06T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.817976 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.818047 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.818060 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.818079 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.818116 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:18Z","lastTransitionTime":"2025-10-06T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.877916 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.878430 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.878306 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.878459 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:18 crc kubenswrapper[4755]: E1006 08:23:18.878639 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:18 crc kubenswrapper[4755]: E1006 08:23:18.878836 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:18 crc kubenswrapper[4755]: E1006 08:23:18.879005 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:18 crc kubenswrapper[4755]: E1006 08:23:18.879250 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.921602 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.921683 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.921720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.921751 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:18 crc kubenswrapper[4755]: I1006 08:23:18.921765 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:18Z","lastTransitionTime":"2025-10-06T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.024503 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.024558 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.024589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.024606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.024619 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.127699 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.127779 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.127799 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.127833 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.127854 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.230808 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.230861 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.230872 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.230889 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.230901 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.334276 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.334330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.334344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.334364 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.334376 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.438345 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.438423 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.438443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.438476 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.438495 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.540876 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.540958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.540975 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.540999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.541017 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.645590 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.645653 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.645670 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.645692 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.645710 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.749721 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.749804 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.749824 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.749867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.749890 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.854083 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.854150 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.854167 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.854191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.854208 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.869316 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.869396 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.869419 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.869448 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.869470 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: E1006 08:23:19.892414 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.898366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.898446 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.898467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.898494 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.898511 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: E1006 08:23:19.920485 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.926502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.926592 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.926612 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.926637 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.926655 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: E1006 08:23:19.949899 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.955814 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.955884 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.955903 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.955927 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.955946 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:19 crc kubenswrapper[4755]: E1006 08:23:19.976638 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.983724 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.983787 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.983806 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.983837 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:19 crc kubenswrapper[4755]: I1006 08:23:19.983860 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:19Z","lastTransitionTime":"2025-10-06T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:20 crc kubenswrapper[4755]: E1006 08:23:20.005039 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:20Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:20 crc kubenswrapper[4755]: E1006 08:23:20.005343 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.007384 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.007468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.007496 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.007527 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.007554 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:20Z","lastTransitionTime":"2025-10-06T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.109889 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.109952 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.109970 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.109994 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.110012 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:20Z","lastTransitionTime":"2025-10-06T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.212244 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.212316 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.212325 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.212342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.212356 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:20Z","lastTransitionTime":"2025-10-06T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.316487 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.316554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.316600 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.316625 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.316647 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:20Z","lastTransitionTime":"2025-10-06T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.419759 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.419853 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.419875 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.419906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.419929 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:20Z","lastTransitionTime":"2025-10-06T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.523308 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.523392 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.523411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.523443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.523465 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:20Z","lastTransitionTime":"2025-10-06T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.627252 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.627310 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.627326 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.627352 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.627366 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:20Z","lastTransitionTime":"2025-10-06T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.730812 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.730874 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.730886 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.730908 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.730921 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:20Z","lastTransitionTime":"2025-10-06T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.834380 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.834425 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.834435 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.834456 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.834467 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:20Z","lastTransitionTime":"2025-10-06T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.878346 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.878381 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.878397 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.878515 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:20 crc kubenswrapper[4755]: E1006 08:23:20.878514 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:20 crc kubenswrapper[4755]: E1006 08:23:20.878678 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:20 crc kubenswrapper[4755]: E1006 08:23:20.878791 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:20 crc kubenswrapper[4755]: E1006 08:23:20.878897 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.938257 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.938359 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.938376 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.938402 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:20 crc kubenswrapper[4755]: I1006 08:23:20.938419 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:20Z","lastTransitionTime":"2025-10-06T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.042947 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.043016 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.043034 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.043070 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.043086 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:21Z","lastTransitionTime":"2025-10-06T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.146913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.146977 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.146996 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.147030 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.147046 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:21Z","lastTransitionTime":"2025-10-06T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.249945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.250035 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.250056 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.250154 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.250178 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:21Z","lastTransitionTime":"2025-10-06T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.354147 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.354199 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.354209 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.354227 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.354239 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:21Z","lastTransitionTime":"2025-10-06T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.457122 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.457204 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.457224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.457261 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.457282 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:21Z","lastTransitionTime":"2025-10-06T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.560334 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.560391 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.560408 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.560429 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.560443 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:21Z","lastTransitionTime":"2025-10-06T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.664477 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.664546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.664601 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.664642 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.664659 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:21Z","lastTransitionTime":"2025-10-06T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.767464 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.767533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.767551 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.767624 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.767696 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:21Z","lastTransitionTime":"2025-10-06T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.871776 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.871842 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.871862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.871893 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.871913 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:21Z","lastTransitionTime":"2025-10-06T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.978529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.978641 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.978674 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.978707 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:21 crc kubenswrapper[4755]: I1006 08:23:21.978732 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:21Z","lastTransitionTime":"2025-10-06T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.082324 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.082376 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.082391 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.082413 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.082427 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:22Z","lastTransitionTime":"2025-10-06T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.186164 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.186253 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.186277 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.186306 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.186329 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:22Z","lastTransitionTime":"2025-10-06T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.290090 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.290158 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.290174 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.290197 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.290215 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:22Z","lastTransitionTime":"2025-10-06T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.392541 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.392593 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.392601 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.392614 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.392622 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:22Z","lastTransitionTime":"2025-10-06T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.495282 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.495347 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.495366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.495391 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.495408 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:22Z","lastTransitionTime":"2025-10-06T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.598340 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.598408 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.598421 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.598439 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.598451 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:22Z","lastTransitionTime":"2025-10-06T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.701324 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.701392 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.701408 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.701432 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.701450 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:22Z","lastTransitionTime":"2025-10-06T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.804622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.804658 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.804667 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.804680 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.804692 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:22Z","lastTransitionTime":"2025-10-06T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.878028 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.878083 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.878129 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:22 crc kubenswrapper[4755]: E1006 08:23:22.878172 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.878110 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:22 crc kubenswrapper[4755]: E1006 08:23:22.878310 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:22 crc kubenswrapper[4755]: E1006 08:23:22.878468 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:22 crc kubenswrapper[4755]: E1006 08:23:22.878548 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.907062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.907121 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.907137 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.907162 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:22 crc kubenswrapper[4755]: I1006 08:23:22.907188 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:22Z","lastTransitionTime":"2025-10-06T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.010893 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.010956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.010975 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.010998 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.011017 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:23Z","lastTransitionTime":"2025-10-06T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.113672 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.113738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.113760 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.113782 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.113798 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:23Z","lastTransitionTime":"2025-10-06T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.216848 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.216909 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.216926 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.216949 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.216961 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:23Z","lastTransitionTime":"2025-10-06T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.320968 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.321050 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.321075 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.321109 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.321136 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:23Z","lastTransitionTime":"2025-10-06T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.423723 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.423802 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.423825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.423854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.423877 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:23Z","lastTransitionTime":"2025-10-06T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.526808 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.526865 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.526878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.526899 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.526914 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:23Z","lastTransitionTime":"2025-10-06T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.630975 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.631040 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.631063 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.631136 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.631163 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:23Z","lastTransitionTime":"2025-10-06T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.733800 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.733862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.733878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.733901 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.733920 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:23Z","lastTransitionTime":"2025-10-06T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.836332 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.836390 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.836408 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.836430 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.836447 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:23Z","lastTransitionTime":"2025-10-06T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.879218 4755 scope.go:117] "RemoveContainer" containerID="e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47" Oct 06 08:23:23 crc kubenswrapper[4755]: E1006 08:23:23.879529 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.904076 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID
\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.920933 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.939378 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.939432 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.939443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:23 crc 
kubenswrapper[4755]: I1006 08:23:23.939464 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.939476 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:23Z","lastTransitionTime":"2025-10-06T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.939503 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf25180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06
T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.959957 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\
\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T0
8:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.977281 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:23 crc kubenswrapper[4755]: I1006 08:23:23.993855 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.008062 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.029665 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.042003 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.042057 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.042072 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:24 crc 
kubenswrapper[4755]: I1006 08:23:24.042094 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.042109 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:24Z","lastTransitionTime":"2025-10-06T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.055649 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.084773 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:11Z\\\",\\\"message\\\":\\\"60} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:23:11.935977 6379 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935981 6379 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-mh26r\\\\nI1006 08:23:11.935987 6379 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935986 6379 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1006 08:23:11.935994 6379 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1006 08:23:11.936001 6379 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1006 08:23:11.936007 6379 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.100221 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.118749 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:24 crc 
kubenswrapper[4755]: I1006 08:23:24.146864 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.146968 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.146986 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.147009 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.147024 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:24Z","lastTransitionTime":"2025-10-06T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.152227 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.173658 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba582c30-5753-4c4d-99d9-ad31cd59ec1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93874dc90338ebd50d41428b77b4e2dd449e76144dd24496e5a600b34d0493c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9ef9720e2410a56e4c7545511fb13d9bd68254cf0072d9dc6afb84de237a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7baf461ac5121358231a5700611f38875e26386b1fe59a2b49ae3b2d976fe083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.194142 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.209852 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.225490 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.242985 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.249686 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.249780 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.249809 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.249849 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.249880 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:24Z","lastTransitionTime":"2025-10-06T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.352660 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.352701 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.352714 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.352730 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.352742 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:24Z","lastTransitionTime":"2025-10-06T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.456855 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.456962 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.456991 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.457032 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.457059 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:24Z","lastTransitionTime":"2025-10-06T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.559825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.559897 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.559913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.559945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.559963 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:24Z","lastTransitionTime":"2025-10-06T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.662038 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.662080 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.662090 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.662107 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.662121 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:24Z","lastTransitionTime":"2025-10-06T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.764734 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.764799 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.764882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.764917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.764942 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:24Z","lastTransitionTime":"2025-10-06T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.867928 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.868051 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.868066 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.868086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.868102 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:24Z","lastTransitionTime":"2025-10-06T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.878976 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.879042 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:24 crc kubenswrapper[4755]: E1006 08:23:24.879106 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.879154 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.879177 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:24 crc kubenswrapper[4755]: E1006 08:23:24.879324 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:24 crc kubenswrapper[4755]: E1006 08:23:24.879699 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:24 crc kubenswrapper[4755]: E1006 08:23:24.879544 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.971733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.971794 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.971812 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.971838 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:24 crc kubenswrapper[4755]: I1006 08:23:24.971852 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:24Z","lastTransitionTime":"2025-10-06T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.075415 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.075456 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.075468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.075484 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.075496 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:25Z","lastTransitionTime":"2025-10-06T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.178325 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.178401 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.178426 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.178452 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.178466 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:25Z","lastTransitionTime":"2025-10-06T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.280779 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.280829 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.280839 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.280860 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.280872 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:25Z","lastTransitionTime":"2025-10-06T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.382542 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.382607 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.382620 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.382636 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.382648 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:25Z","lastTransitionTime":"2025-10-06T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.488429 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.488500 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.488519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.488546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.488600 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:25Z","lastTransitionTime":"2025-10-06T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.592507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.592599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.592622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.592646 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.592665 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:25Z","lastTransitionTime":"2025-10-06T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.696216 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.696272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.696291 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.696317 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.696340 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:25Z","lastTransitionTime":"2025-10-06T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.799978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.800042 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.800062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.800091 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.800110 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:25Z","lastTransitionTime":"2025-10-06T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.906020 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.906070 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.906087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.906107 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:25 crc kubenswrapper[4755]: I1006 08:23:25.906122 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:25Z","lastTransitionTime":"2025-10-06T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.008538 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.008653 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.008673 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.008703 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.008725 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:26Z","lastTransitionTime":"2025-10-06T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.110781 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.110839 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.110856 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.110880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.110902 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:26Z","lastTransitionTime":"2025-10-06T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.214222 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.214272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.214284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.214301 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.214313 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:26Z","lastTransitionTime":"2025-10-06T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.316849 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.316918 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.316942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.316971 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.316995 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:26Z","lastTransitionTime":"2025-10-06T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.419798 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.419933 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.419956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.419987 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.420009 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:26Z","lastTransitionTime":"2025-10-06T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.523278 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.523326 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.523344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.523370 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.523387 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:26Z","lastTransitionTime":"2025-10-06T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.626764 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.626813 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.626830 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.626856 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.626874 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:26Z","lastTransitionTime":"2025-10-06T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.730829 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.730897 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.730920 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.730950 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.730975 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:26Z","lastTransitionTime":"2025-10-06T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.834180 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.834239 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.834262 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.834292 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.834316 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:26Z","lastTransitionTime":"2025-10-06T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.878006 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.878089 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.878011 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.878089 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:26 crc kubenswrapper[4755]: E1006 08:23:26.878228 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:26 crc kubenswrapper[4755]: E1006 08:23:26.878419 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:26 crc kubenswrapper[4755]: E1006 08:23:26.878525 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:26 crc kubenswrapper[4755]: E1006 08:23:26.878735 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.937604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.937731 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.937741 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.937756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:26 crc kubenswrapper[4755]: I1006 08:23:26.937765 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:26Z","lastTransitionTime":"2025-10-06T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.040435 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.040498 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.040519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.040546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.040593 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:27Z","lastTransitionTime":"2025-10-06T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.143836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.143884 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.143895 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.143913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.143926 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:27Z","lastTransitionTime":"2025-10-06T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.247929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.248107 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.248216 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.248304 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.248333 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:27Z","lastTransitionTime":"2025-10-06T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.352350 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.352417 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.352437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.352462 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.352478 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:27Z","lastTransitionTime":"2025-10-06T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.482677 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.482778 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.482810 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.482847 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.482873 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:27Z","lastTransitionTime":"2025-10-06T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.586743 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.586807 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.586817 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.586839 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.586852 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:27Z","lastTransitionTime":"2025-10-06T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.690057 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.690121 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.690144 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.690174 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.690196 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:27Z","lastTransitionTime":"2025-10-06T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.793369 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.793461 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.793479 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.793508 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.793526 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:27Z","lastTransitionTime":"2025-10-06T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.895665 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.895718 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.895733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.895756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.895770 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:27Z","lastTransitionTime":"2025-10-06T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.998686 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.998730 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.998740 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.998753 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:27 crc kubenswrapper[4755]: I1006 08:23:27.998779 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:27Z","lastTransitionTime":"2025-10-06T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.101344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.101387 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.101397 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.101412 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.101451 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:28Z","lastTransitionTime":"2025-10-06T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.203833 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.203881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.203891 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.203907 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.203918 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:28Z","lastTransitionTime":"2025-10-06T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.307075 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.307157 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.307183 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.307218 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.307425 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:28Z","lastTransitionTime":"2025-10-06T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.410633 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.410687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.410697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.410712 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.410737 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:28Z","lastTransitionTime":"2025-10-06T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.512772 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.512854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.512875 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.512906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.512928 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:28Z","lastTransitionTime":"2025-10-06T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.615661 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.615719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.615733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.615756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.615772 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:28Z","lastTransitionTime":"2025-10-06T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.718818 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.718868 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.718880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.718902 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.718916 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:28Z","lastTransitionTime":"2025-10-06T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.821537 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.821627 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.821646 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.821672 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.821693 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:28Z","lastTransitionTime":"2025-10-06T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.877775 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.877799 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.877847 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.877841 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:28 crc kubenswrapper[4755]: E1006 08:23:28.878000 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:28 crc kubenswrapper[4755]: E1006 08:23:28.878091 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:28 crc kubenswrapper[4755]: E1006 08:23:28.878243 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:28 crc kubenswrapper[4755]: E1006 08:23:28.878338 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.924756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.924834 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.924887 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.924926 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:28 crc kubenswrapper[4755]: I1006 08:23:28.924951 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:28Z","lastTransitionTime":"2025-10-06T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.028332 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.028385 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.028404 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.028424 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.028438 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:29Z","lastTransitionTime":"2025-10-06T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.131408 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.131498 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.131532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.131602 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.131634 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:29Z","lastTransitionTime":"2025-10-06T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.234808 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.234874 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.234886 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.234905 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.234917 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:29Z","lastTransitionTime":"2025-10-06T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.338823 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.338874 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.338889 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.338912 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.338927 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:29Z","lastTransitionTime":"2025-10-06T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.441956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.442010 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.442021 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.442043 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.442056 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:29Z","lastTransitionTime":"2025-10-06T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.545075 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.545142 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.545157 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.545178 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.545190 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:29Z","lastTransitionTime":"2025-10-06T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.651750 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.651796 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.651811 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.651832 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.651849 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:29Z","lastTransitionTime":"2025-10-06T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.755929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.756006 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.756018 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.756038 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.756052 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:29Z","lastTransitionTime":"2025-10-06T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.859296 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.859351 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.859366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.859390 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.859405 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:29Z","lastTransitionTime":"2025-10-06T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.962605 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.962666 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.962694 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.962716 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:29 crc kubenswrapper[4755]: I1006 08:23:29.962730 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:29Z","lastTransitionTime":"2025-10-06T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.065401 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.065491 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.065503 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.065520 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.065531 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.169681 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.169761 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.169787 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.169815 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.169832 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.200792 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.200857 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.200872 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.200898 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.200919 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: E1006 08:23:30.215828 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.221075 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.221132 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.221147 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.221173 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.221182 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: E1006 08:23:30.237325 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.247317 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.247383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.247396 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.247440 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.247457 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: E1006 08:23:30.262893 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.267107 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.267139 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.267151 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.267171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.267183 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: E1006 08:23:30.280848 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.285130 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.285158 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.285168 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.285181 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.285190 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: E1006 08:23:30.303387 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:30 crc kubenswrapper[4755]: E1006 08:23:30.303556 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.305787 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.305921 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.305948 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.305981 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.306002 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.334148 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:30 crc kubenswrapper[4755]: E1006 08:23:30.334416 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:23:30 crc kubenswrapper[4755]: E1006 08:23:30.334530 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs podName:60fbd235-a60f-436e-9552-e3eaf60f24f3 nodeName:}" failed. No retries permitted until 2025-10-06 08:24:02.334492871 +0000 UTC m=+99.163808265 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs") pod "network-metrics-daemon-vf9ht" (UID: "60fbd235-a60f-436e-9552-e3eaf60f24f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.408440 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.408521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.408535 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.408587 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.408599 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.510784 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.510844 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.510860 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.510883 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.510898 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.613188 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.613250 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.613268 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.613295 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.613314 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.716249 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.716326 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.716345 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.716373 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.716435 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.819635 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.819688 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.819797 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.819822 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.819837 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.878051 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.878102 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.878144 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.878190 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:30 crc kubenswrapper[4755]: E1006 08:23:30.878263 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:30 crc kubenswrapper[4755]: E1006 08:23:30.878430 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:30 crc kubenswrapper[4755]: E1006 08:23:30.878634 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:30 crc kubenswrapper[4755]: E1006 08:23:30.878772 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.923316 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.923368 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.923376 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.923393 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:30 crc kubenswrapper[4755]: I1006 08:23:30.923404 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:30Z","lastTransitionTime":"2025-10-06T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.026373 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.026444 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.026467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.026496 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.026515 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:31Z","lastTransitionTime":"2025-10-06T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.129148 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.129187 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.129195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.129208 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.129218 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:31Z","lastTransitionTime":"2025-10-06T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.231667 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.231719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.231730 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.231747 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.231760 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:31Z","lastTransitionTime":"2025-10-06T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.334646 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.334681 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.334700 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.334717 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.334729 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:31Z","lastTransitionTime":"2025-10-06T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.398587 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r96nx_891dff9a-4752-4022-83fc-51f626c76991/kube-multus/0.log" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.398669 4755 generic.go:334] "Generic (PLEG): container finished" podID="891dff9a-4752-4022-83fc-51f626c76991" containerID="316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa" exitCode=1 Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.398731 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r96nx" event={"ID":"891dff9a-4752-4022-83fc-51f626c76991","Type":"ContainerDied","Data":"316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa"} Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.399373 4755 scope.go:117] "RemoveContainer" containerID="316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.411949 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf2
5180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.428831 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.438969 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.439006 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.439018 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.439034 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.439044 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:31Z","lastTransitionTime":"2025-10-06T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.444677 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.460210 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.473386 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.487913 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e
6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.503934 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.519227 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.538118 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:31Z\\\",\\\"message\\\":\\\"2025-10-06T08:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8\\\\n2025-10-06T08:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8 to /host/opt/cni/bin/\\\\n2025-10-06T08:22:46Z [verbose] multus-daemon started\\\\n2025-10-06T08:22:46Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.542052 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.542106 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.542117 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.542141 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.542156 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:31Z","lastTransitionTime":"2025-10-06T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.566581 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:11Z\\\",\\\"message\\\":\\\"60} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:23:11.935977 6379 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935981 6379 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-mh26r\\\\nI1006 08:23:11.935987 6379 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935986 6379 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1006 08:23:11.935994 6379 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1006 08:23:11.936001 6379 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1006 08:23:11.936007 6379 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.580099 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.593041 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc 
kubenswrapper[4755]: I1006 08:23:31.613315 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.626698 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba582c30-5753-4c4d-99d9-ad31cd59ec1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93874dc90338ebd50d41428b77b4e2dd449e76144dd24496e5a600b34d0493c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9ef9720e2410a56e4c7545511fb13d9bd68254cf0072d9dc6afb84de237a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7baf461ac5121358231a5700611f38875e26386b1fe59a2b49ae3b2d976fe083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.644196 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.645934 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.645994 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.646007 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 
08:23:31.646030 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.646044 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:31Z","lastTransitionTime":"2025-10-06T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.657743 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.669901 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.682198 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.748542 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 
08:23:31.748606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.748615 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.748628 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.748639 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:31Z","lastTransitionTime":"2025-10-06T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.851369 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.851427 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.851437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.851461 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.851472 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:31Z","lastTransitionTime":"2025-10-06T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.953181 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.953232 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.953247 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.953270 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:31 crc kubenswrapper[4755]: I1006 08:23:31.953288 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:31Z","lastTransitionTime":"2025-10-06T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.056661 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.056722 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.056739 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.056766 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.056785 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:32Z","lastTransitionTime":"2025-10-06T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.160055 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.160121 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.160139 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.160162 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.160176 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:32Z","lastTransitionTime":"2025-10-06T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.263591 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.263651 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.263665 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.263686 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.263696 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:32Z","lastTransitionTime":"2025-10-06T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.366185 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.366259 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.366279 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.366305 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.366324 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:32Z","lastTransitionTime":"2025-10-06T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.408539 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r96nx_891dff9a-4752-4022-83fc-51f626c76991/kube-multus/0.log" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.408693 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r96nx" event={"ID":"891dff9a-4752-4022-83fc-51f626c76991","Type":"ContainerStarted","Data":"252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c"} Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.431755 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8
b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.453114 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.469420 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.469542 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.469614 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.469647 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.469668 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.469681 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:32Z","lastTransitionTime":"2025-10-06T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.482023 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.497623 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.510364 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.525553 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf2
5180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.540227 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:31Z\\\",\\\"message\\\":\\\"2025-10-06T08:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8\\\\n2025-10-06T08:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8 to /host/opt/cni/bin/\\\\n2025-10-06T08:22:46Z [verbose] multus-daemon started\\\\n2025-10-06T08:22:46Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T08:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.560460 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:11Z\\\",\\\"message\\\":\\\"60} was not added to shared informer because it has stopped already, failed to start node 
network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:23:11.935977 6379 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935981 6379 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-mh26r\\\\nI1006 08:23:11.935987 6379 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935986 6379 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1006 08:23:11.935994 6379 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1006 08:23:11.936001 6379 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1006 08:23:11.936007 6379 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.572872 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.572967 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.573013 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.573041 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.573058 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:32Z","lastTransitionTime":"2025-10-06T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.575271 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.590336 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc 
kubenswrapper[4755]: I1006 08:23:32.612063 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.626547 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba582c30-5753-4c4d-99d9-ad31cd59ec1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93874dc90338ebd50d41428b77b4e2dd449e76144dd24496e5a600b34d0493c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9ef9720e2410a56e4c7545511fb13d9bd68254cf0072d9dc6afb84de237a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7baf461ac5121358231a5700611f38875e26386b1fe59a2b49ae3b2d976fe083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.645245 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.655900 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.672632 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.675295 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.675342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.675353 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.675370 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.675382 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:32Z","lastTransitionTime":"2025-10-06T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.688297 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.703446 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.778075 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.778126 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:32 crc 
kubenswrapper[4755]: I1006 08:23:32.778138 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.778157 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.778168 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:32Z","lastTransitionTime":"2025-10-06T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.878901 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.878975 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.879020 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.878901 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:32 crc kubenswrapper[4755]: E1006 08:23:32.879083 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:32 crc kubenswrapper[4755]: E1006 08:23:32.879257 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:32 crc kubenswrapper[4755]: E1006 08:23:32.879341 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:32 crc kubenswrapper[4755]: E1006 08:23:32.879424 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.881344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.881389 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.881404 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.881426 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.881439 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:32Z","lastTransitionTime":"2025-10-06T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.984397 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.984443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.984454 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.984476 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:32 crc kubenswrapper[4755]: I1006 08:23:32.984488 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:32Z","lastTransitionTime":"2025-10-06T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.088212 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.088267 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.088281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.088304 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.088315 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:33Z","lastTransitionTime":"2025-10-06T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.191505 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.191559 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.191588 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.191608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.191620 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:33Z","lastTransitionTime":"2025-10-06T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.294348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.294403 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.294413 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.294434 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.294446 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:33Z","lastTransitionTime":"2025-10-06T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.397711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.397796 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.397821 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.397854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.397891 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:33Z","lastTransitionTime":"2025-10-06T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.501962 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.502025 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.502037 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.502061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.502079 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:33Z","lastTransitionTime":"2025-10-06T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.605827 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.605897 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.605916 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.605949 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.605969 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:33Z","lastTransitionTime":"2025-10-06T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.708980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.709043 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.709059 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.709081 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.709095 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:33Z","lastTransitionTime":"2025-10-06T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.811617 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.811676 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.811693 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.811716 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.811732 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:33Z","lastTransitionTime":"2025-10-06T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.903330 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.917447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.917503 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.917521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.917544 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.917557 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:33Z","lastTransitionTime":"2025-10-06T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.920227 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba582c30-5753-4c4d-99d9-ad31cd59ec1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93874dc90338ebd50d41428b77b4e2dd449e76144dd24496e5a600b34d0493c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9ef9720e2410a56e4c7545511fb1
3d9bd68254cf0072d9dc6afb84de237a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7baf461ac5121358231a5700611f38875e26386b1fe59a2b49ae3b2d976fe083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.936288 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.952766 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.977187 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:33 crc kubenswrapper[4755]: I1006 08:23:33.995954 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.018436 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834
285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.019643 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.019686 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.019704 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.019732 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.019753 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:34Z","lastTransitionTime":"2025-10-06T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.036052 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.052826 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf2
5180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.070994 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.086653 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.103376 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.115211 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.122008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.122056 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.122070 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.122087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.122100 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:34Z","lastTransitionTime":"2025-10-06T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.133907 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.151630 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:31Z\\\",\\\"message\\\":\\\"2025-10-06T08:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8\\\\n2025-10-06T08:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8 to /host/opt/cni/bin/\\\\n2025-10-06T08:22:46Z [verbose] multus-daemon started\\\\n2025-10-06T08:22:46Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T08:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.178387 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:11Z\\\",\\\"message\\\":\\\"60} was not added to shared informer because it has stopped already, failed to start node 
network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:23:11.935977 6379 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935981 6379 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-mh26r\\\\nI1006 08:23:11.935987 6379 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935986 6379 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1006 08:23:11.935994 6379 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1006 08:23:11.936001 6379 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1006 08:23:11.936007 6379 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.193499 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.209883 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:34 crc 
kubenswrapper[4755]: I1006 08:23:34.224914 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.224963 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.224979 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.225003 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.225017 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:34Z","lastTransitionTime":"2025-10-06T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.329668 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.329719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.329733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.329767 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.329787 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:34Z","lastTransitionTime":"2025-10-06T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.433062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.433118 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.433131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.433152 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.433166 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:34Z","lastTransitionTime":"2025-10-06T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.537214 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.537283 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.537300 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.537357 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.537381 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:34Z","lastTransitionTime":"2025-10-06T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.641221 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.641278 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.641295 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.641321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.641338 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:34Z","lastTransitionTime":"2025-10-06T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.744161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.744216 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.744226 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.744249 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.744261 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:34Z","lastTransitionTime":"2025-10-06T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.847644 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.847700 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.847713 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.847735 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.847747 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:34Z","lastTransitionTime":"2025-10-06T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.878548 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.878642 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.878712 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:34 crc kubenswrapper[4755]: E1006 08:23:34.878742 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:34 crc kubenswrapper[4755]: E1006 08:23:34.878899 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.878973 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:34 crc kubenswrapper[4755]: E1006 08:23:34.879032 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:34 crc kubenswrapper[4755]: E1006 08:23:34.879158 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.950985 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.951066 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.951079 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.951105 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:34 crc kubenswrapper[4755]: I1006 08:23:34.951120 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:34Z","lastTransitionTime":"2025-10-06T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.055172 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.055239 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.055257 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.055283 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.055299 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:35Z","lastTransitionTime":"2025-10-06T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.158774 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.158878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.158897 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.158933 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.158954 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:35Z","lastTransitionTime":"2025-10-06T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.262277 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.262360 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.262376 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.262408 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.262427 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:35Z","lastTransitionTime":"2025-10-06T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.365383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.365447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.365459 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.365484 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.365498 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:35Z","lastTransitionTime":"2025-10-06T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.469707 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.469797 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.469817 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.469854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.469876 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:35Z","lastTransitionTime":"2025-10-06T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.573740 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.573799 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.573810 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.573832 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.573845 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:35Z","lastTransitionTime":"2025-10-06T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.676773 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.676852 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.676871 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.676902 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.676925 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:35Z","lastTransitionTime":"2025-10-06T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.780350 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.780417 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.780427 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.780450 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.780464 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:35Z","lastTransitionTime":"2025-10-06T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.883356 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.883409 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.883421 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.883443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.883462 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:35Z","lastTransitionTime":"2025-10-06T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.986281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.986327 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.986339 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.986361 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:35 crc kubenswrapper[4755]: I1006 08:23:35.986376 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:35Z","lastTransitionTime":"2025-10-06T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.089932 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.090009 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.090034 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.090063 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.090081 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:36Z","lastTransitionTime":"2025-10-06T08:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.193161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.193237 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.193255 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.193286 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.193305 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:36Z","lastTransitionTime":"2025-10-06T08:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.297618 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.297717 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.297747 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.297777 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.297798 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:36Z","lastTransitionTime":"2025-10-06T08:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.401627 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.401697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.401713 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.401738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.401755 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:36Z","lastTransitionTime":"2025-10-06T08:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.504688 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.504785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.504812 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.504847 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.504869 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:36Z","lastTransitionTime":"2025-10-06T08:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.610724 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.610795 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.610816 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.610845 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.610865 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:36Z","lastTransitionTime":"2025-10-06T08:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.713945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.714007 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.714032 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.714059 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.714075 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:36Z","lastTransitionTime":"2025-10-06T08:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.818662 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.818743 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.818771 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.818803 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.818820 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:36Z","lastTransitionTime":"2025-10-06T08:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.878783 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.878841 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:36 crc kubenswrapper[4755]: E1006 08:23:36.878956 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.879018 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:36 crc kubenswrapper[4755]: E1006 08:23:36.879147 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.879205 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:36 crc kubenswrapper[4755]: E1006 08:23:36.879346 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:36 crc kubenswrapper[4755]: E1006 08:23:36.879593 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.922651 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.922719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.922736 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.922764 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:36 crc kubenswrapper[4755]: I1006 08:23:36.922781 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:36Z","lastTransitionTime":"2025-10-06T08:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.026694 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.026767 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.026785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.026813 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.026836 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:37Z","lastTransitionTime":"2025-10-06T08:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.129873 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.129962 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.129985 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.130019 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.130043 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:37Z","lastTransitionTime":"2025-10-06T08:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.233475 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.233530 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.233543 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.233580 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.233596 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:37Z","lastTransitionTime":"2025-10-06T08:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.336676 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.336732 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.336742 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.336759 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.336770 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:37Z","lastTransitionTime":"2025-10-06T08:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.439715 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.439767 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.439781 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.439802 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.439820 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:37Z","lastTransitionTime":"2025-10-06T08:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.544235 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.544319 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.544344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.544432 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.544463 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:37Z","lastTransitionTime":"2025-10-06T08:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.647675 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.647720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.647730 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.647747 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.647758 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:37Z","lastTransitionTime":"2025-10-06T08:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.751712 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.751811 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.751842 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.751880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.751910 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:37Z","lastTransitionTime":"2025-10-06T08:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.855424 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.855492 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.855507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.855529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.855544 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:37Z","lastTransitionTime":"2025-10-06T08:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.879918 4755 scope.go:117] "RemoveContainer" containerID="e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.959250 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.959863 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.959901 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.959923 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:37 crc kubenswrapper[4755]: I1006 08:23:37.959938 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:37Z","lastTransitionTime":"2025-10-06T08:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.063683 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.063758 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.063772 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.063795 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.063807 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:38Z","lastTransitionTime":"2025-10-06T08:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.167211 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.167283 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.167303 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.167335 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.167353 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:38Z","lastTransitionTime":"2025-10-06T08:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.270603 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.270654 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.270667 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.270692 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.270705 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:38Z","lastTransitionTime":"2025-10-06T08:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.374146 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.374204 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.374220 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.374244 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.374258 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:38Z","lastTransitionTime":"2025-10-06T08:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.434361 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/2.log" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.437275 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerStarted","Data":"5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd"} Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.437811 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.460892 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.478352 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.478405 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.478423 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:38 crc 
kubenswrapper[4755]: I1006 08:23:38.478447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.478459 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:38Z","lastTransitionTime":"2025-10-06T08:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.482680 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:31Z\\\",\\\"message\\\":\\\"2025-10-06T08:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8\\\\n2025-10-06T08:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8 to /host/opt/cni/bin/\\\\n2025-10-06T08:22:46Z [verbose] multus-daemon started\\\\n2025-10-06T08:22:46Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.510545 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:11Z\\\",\\\"message\\\":\\\"60} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:23:11.935977 6379 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935981 6379 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-mh26r\\\\nI1006 08:23:11.935987 6379 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935986 6379 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1006 08:23:11.935994 6379 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1006 08:23:11.936001 6379 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1006 08:23:11.936007 6379 default_network_controller.go:776] Recording success event on pod 
openshift-etcd/etcd-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-conf
ig/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-se
tup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.528626 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.549145 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc 
kubenswrapper[4755]: I1006 08:23:38.578894 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.580942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.580997 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.581008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.581038 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.581052 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:38Z","lastTransitionTime":"2025-10-06T08:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.594476 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba582c30-5753-4c4d-99d9-ad31cd59ec1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93874dc90338ebd50d41428b77b4e2dd449e76144dd24496e5a600b34d0493c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9ef9720e2410a56e4c7545511fb1
3d9bd68254cf0072d9dc6afb84de237a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7baf461ac5121358231a5700611f38875e26386b1fe59a2b49ae3b2d976fe083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.610272 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.620929 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.634998 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.649532 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.663801 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.679131 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.684146 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.684211 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.684227 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.684252 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.684266 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:38Z","lastTransitionTime":"2025-10-06T08:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.693125 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.704906 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.720173 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834
285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.732495 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: 
I1006 08:23:38.742000 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf25180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.787994 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.788067 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.788086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.788112 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.788135 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:38Z","lastTransitionTime":"2025-10-06T08:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.878704 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.878705 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:38 crc kubenswrapper[4755]: E1006 08:23:38.878895 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.878704 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.878705 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:38 crc kubenswrapper[4755]: E1006 08:23:38.879035 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:38 crc kubenswrapper[4755]: E1006 08:23:38.879115 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:38 crc kubenswrapper[4755]: E1006 08:23:38.879234 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.891041 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.891096 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.891112 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.891131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.891144 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:38Z","lastTransitionTime":"2025-10-06T08:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.994289 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.994327 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.994336 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.994354 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:38 crc kubenswrapper[4755]: I1006 08:23:38.994365 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:38Z","lastTransitionTime":"2025-10-06T08:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.097174 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.097269 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.097288 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.097309 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.097324 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:39Z","lastTransitionTime":"2025-10-06T08:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.201128 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.201184 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.201198 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.201219 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.201237 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:39Z","lastTransitionTime":"2025-10-06T08:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.305755 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.305837 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.305860 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.305888 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.305908 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:39Z","lastTransitionTime":"2025-10-06T08:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.409395 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.409440 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.409455 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.409474 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.409486 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:39Z","lastTransitionTime":"2025-10-06T08:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.443806 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/3.log" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.444803 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/2.log" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.450953 4755 generic.go:334] "Generic (PLEG): container finished" podID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerID="5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd" exitCode=1 Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.451018 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd"} Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.451072 4755 scope.go:117] "RemoveContainer" containerID="e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.451919 4755 scope.go:117] "RemoveContainer" containerID="5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd" Oct 06 08:23:39 crc kubenswrapper[4755]: E1006 08:23:39.452152 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.478320 4755 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2
d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1
bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.497807 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba582c30-5753-4c4d-99d9-ad31cd59ec1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93874dc90338ebd50d41428b77b4e2dd449e76144dd24496e5a600b34d0493c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347202
43b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9ef9720e2410a56e4c7545511fb13d9bd68254cf0072d9dc6afb84de237a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7baf461ac5121358231a5700611f38875e26386b1fe59a2b49ae3b2d976fe083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
0-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.513462 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.513545 4755 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.513603 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.513640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.513662 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:39Z","lastTransitionTime":"2025-10-06T08:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.520124 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.539207 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.561522 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.583515 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.595161 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.614800 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8
040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.617138 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.617322 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.617462 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.617548 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.617606 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:39Z","lastTransitionTime":"2025-10-06T08:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.634517 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\"
,\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.650799 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf25180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.677247 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.695058 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.713254 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.722518 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.722602 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.722620 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.722646 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.722660 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:39Z","lastTransitionTime":"2025-10-06T08:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.726493 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc 
kubenswrapper[4755]: I1006 08:23:39.745590 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.763343 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:31Z\\\",\\\"message\\\":\\\"2025-10-06T08:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8\\\\n2025-10-06T08:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8 to /host/opt/cni/bin/\\\\n2025-10-06T08:22:46Z [verbose] multus-daemon started\\\\n2025-10-06T08:22:46Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T08:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.791200 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e032303778ef147d2013878ba1f7f8fe2d39fda711282ea31f8b633adb818e47\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:11Z\\\",\\\"message\\\":\\\"60} was not added to shared informer because it has stopped already, failed to start node 
network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:11Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:23:11.935977 6379 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935981 6379 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-mh26r\\\\nI1006 08:23:11.935987 6379 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1006 08:23:11.935986 6379 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1006 08:23:11.935994 6379 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1006 08:23:11.936001 6379 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1006 08:23:11.936007 6379 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:141\\\\nI1006 08:23:38.861216 6733 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:23:38.861627 
6733 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:23:38.861886 6733 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 08:23:38.861949 6733 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 08:23:38.861963 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:23:38.861980 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:23:38.861991 6733 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:23:38.862018 6733 factory.go:656] Stopping watch factory\\\\nI1006 08:23:38.862044 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:23:38.862055 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:23:38.862069 6733 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.809595 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.826552 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.826658 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.826683 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.826716 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.826735 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:39Z","lastTransitionTime":"2025-10-06T08:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.930133 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.930183 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.930203 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.930229 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:39 crc kubenswrapper[4755]: I1006 08:23:39.930278 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:39Z","lastTransitionTime":"2025-10-06T08:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.033997 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.034105 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.034161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.034191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.034246 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.140664 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.140833 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.140857 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.140920 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.140940 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.245394 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.245534 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.245634 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.245739 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.245847 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.350191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.350236 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.350254 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.350280 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.350297 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.454415 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.454504 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.454533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.454571 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.454595 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.458498 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/3.log" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.465248 4755 scope.go:117] "RemoveContainer" containerID="5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd" Oct 06 08:23:40 crc kubenswrapper[4755]: E1006 08:23:40.465542 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.483430 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.505756 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.523096 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.543558 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.543716 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.543737 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.543772 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.543758 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.543794 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.560505 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf25180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: E1006 08:23:40.564704 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.569542 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.569636 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.569654 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.569686 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.569704 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.578497 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated 
for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: E1006 08:23:40.581699 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.585833 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.585894 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.585909 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.585930 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.585960 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.595703 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce
984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: E1006 08:23:40.605057 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.609694 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.609956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.609993 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.610740 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.610820 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.611099 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.622838 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: E1006 08:23:40.624847 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.629552 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.629607 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.629622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.629643 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.629657 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.638772 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: E1006 08:23:40.644828 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: E1006 08:23:40.645495 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.649252 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.649315 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.649336 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.649366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.649387 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.653236 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.671653 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.689191 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:31Z\\\",\\\"message\\\":\\\"2025-10-06T08:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8\\\\n2025-10-06T08:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8 to /host/opt/cni/bin/\\\\n2025-10-06T08:22:46Z [verbose] multus-daemon started\\\\n2025-10-06T08:22:46Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T08:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.709042 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:141\\\\nI1006 08:23:38.861216 6733 reflector.go:311] Stopping 
reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:23:38.861627 6733 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:23:38.861886 6733 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 08:23:38.861949 6733 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 08:23:38.861963 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:23:38.861980 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:23:38.861991 6733 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:23:38.862018 6733 factory.go:656] Stopping watch factory\\\\nI1006 08:23:38.862044 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:23:38.862055 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:23:38.862069 6733 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.720437 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.733691 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc 
kubenswrapper[4755]: I1006 08:23:40.752493 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.752555 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.752601 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.752650 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.752796 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.755165 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.772897 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba582c30-5753-4c4d-99d9-ad31cd59ec1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93874dc90338ebd50d41428b77b4e2dd449e76144dd24496e5a600b34d0493c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9ef9720e2410a56e4c7545511fb13d9bd68254cf0072d9dc6afb84de237a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7baf461ac5121358231a5700611f38875e26386b1fe59a2b49ae3b2d976fe083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.856364 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.856428 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.856450 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.856479 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.856503 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.877836 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.877887 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.877944 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.877843 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:40 crc kubenswrapper[4755]: E1006 08:23:40.878090 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:40 crc kubenswrapper[4755]: E1006 08:23:40.878230 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:40 crc kubenswrapper[4755]: E1006 08:23:40.878334 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:40 crc kubenswrapper[4755]: E1006 08:23:40.878408 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.960982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.961172 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.961198 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.961260 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:40 crc kubenswrapper[4755]: I1006 08:23:40.961359 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:40Z","lastTransitionTime":"2025-10-06T08:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.064645 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.064692 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.064704 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.064721 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.064734 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:41Z","lastTransitionTime":"2025-10-06T08:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.167913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.168021 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.168041 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.168107 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.168128 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:41Z","lastTransitionTime":"2025-10-06T08:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.271074 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.271140 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.271160 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.271184 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.271201 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:41Z","lastTransitionTime":"2025-10-06T08:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.374543 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.374626 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.374642 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.374671 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.374684 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:41Z","lastTransitionTime":"2025-10-06T08:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.477379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.477420 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.477432 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.477451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.477465 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:41Z","lastTransitionTime":"2025-10-06T08:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.580926 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.581042 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.581059 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.581082 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.581131 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:41Z","lastTransitionTime":"2025-10-06T08:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.684499 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.684567 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.684584 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.684630 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.684657 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:41Z","lastTransitionTime":"2025-10-06T08:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.787525 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.787622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.787641 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.787670 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.787691 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:41Z","lastTransitionTime":"2025-10-06T08:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.891006 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.891066 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.891084 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.891109 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.891128 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:41Z","lastTransitionTime":"2025-10-06T08:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.997762 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.997830 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.997848 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.997875 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:41 crc kubenswrapper[4755]: I1006 08:23:41.997893 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:41Z","lastTransitionTime":"2025-10-06T08:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.101970 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.102043 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.102061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.102089 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.102136 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:42Z","lastTransitionTime":"2025-10-06T08:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.206408 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.206489 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.206508 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.206539 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.206559 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:42Z","lastTransitionTime":"2025-10-06T08:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.310478 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.310995 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.311152 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.311331 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.311998 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:42Z","lastTransitionTime":"2025-10-06T08:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.416298 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.416362 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.416380 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.416408 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.416428 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:42Z","lastTransitionTime":"2025-10-06T08:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.520960 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.521031 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.521054 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.521086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.521108 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:42Z","lastTransitionTime":"2025-10-06T08:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.625516 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.625658 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.625684 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.625715 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.625735 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:42Z","lastTransitionTime":"2025-10-06T08:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.729147 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.729221 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.729240 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.729267 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.729287 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:42Z","lastTransitionTime":"2025-10-06T08:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.833705 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.833760 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.833775 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.833791 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.833804 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:42Z","lastTransitionTime":"2025-10-06T08:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.878826 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.878906 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.878954 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.878839 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:42 crc kubenswrapper[4755]: E1006 08:23:42.879128 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:42 crc kubenswrapper[4755]: E1006 08:23:42.879020 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:42 crc kubenswrapper[4755]: E1006 08:23:42.879320 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:42 crc kubenswrapper[4755]: E1006 08:23:42.879356 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.937007 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.937084 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.937103 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.937131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:42 crc kubenswrapper[4755]: I1006 08:23:42.937157 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:42Z","lastTransitionTime":"2025-10-06T08:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.040044 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.040118 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.040143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.040173 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.040196 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:43Z","lastTransitionTime":"2025-10-06T08:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.144054 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.144123 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.144193 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.144223 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.144241 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:43Z","lastTransitionTime":"2025-10-06T08:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.247977 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.248069 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.248086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.248106 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.248118 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:43Z","lastTransitionTime":"2025-10-06T08:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.351954 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.352022 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.352038 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.352068 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.352087 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:43Z","lastTransitionTime":"2025-10-06T08:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.455713 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.455790 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.455806 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.455827 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.455845 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:43Z","lastTransitionTime":"2025-10-06T08:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.559002 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.559071 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.559095 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.559122 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.559140 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:43Z","lastTransitionTime":"2025-10-06T08:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.661892 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.661972 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.661986 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.662002 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.662013 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:43Z","lastTransitionTime":"2025-10-06T08:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.765668 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.765730 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.765751 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.765965 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.765992 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:43Z","lastTransitionTime":"2025-10-06T08:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.871683 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.871805 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.871829 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.871864 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.871906 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:43Z","lastTransitionTime":"2025-10-06T08:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.907804 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:31Z\\\",\\\"message\\\":\\\"2025-10-06T08:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8\\\\n2025-10-06T08:22:45+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8 to /host/opt/cni/bin/\\\\n2025-10-06T08:22:46Z [verbose] multus-daemon started\\\\n2025-10-06T08:22:46Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.939098 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:141\\\\nI1006 08:23:38.861216 6733 reflector.go:311] Stopping 
reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:23:38.861627 6733 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:23:38.861886 6733 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 08:23:38.861949 6733 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 08:23:38.861963 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:23:38.861980 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:23:38.861991 6733 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:23:38.862018 6733 factory.go:656] Stopping watch factory\\\\nI1006 08:23:38.862044 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:23:38.862055 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:23:38.862069 6733 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.950794 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.963841 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:43 crc 
kubenswrapper[4755]: I1006 08:23:43.975355 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.975436 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.975450 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.975469 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.975482 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:43Z","lastTransitionTime":"2025-10-06T08:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:43 crc kubenswrapper[4755]: I1006 08:23:43.982835 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.003973 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba582c30-5753-4c4d-99d9-ad31cd59ec1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93874dc90338ebd50d41428b77b4e2dd449e76144dd24496e5a600b34d0493c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9ef9720e2410a56e4c7545511fb13d9bd68254cf0072d9dc6afb84de237a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7baf461ac5121358231a5700611f38875e26386b1fe59a2b49ae3b2d976fe083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.034692 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.049089 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.064631 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.078479 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.078520 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.078529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.078546 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.078557 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:44Z","lastTransitionTime":"2025-10-06T08:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.079228 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.093009 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.112114 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.124240 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.138659 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.148265 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.163512 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e
6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.177738 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.182039 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.182115 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.182139 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:44 crc 
kubenswrapper[4755]: I1006 08:23:44.182176 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.182201 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:44Z","lastTransitionTime":"2025-10-06T08:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.191229 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf25180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06
T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.285955 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.286031 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.286051 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.286081 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.286104 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:44Z","lastTransitionTime":"2025-10-06T08:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.389676 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.390030 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.390222 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.390386 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.390533 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:44Z","lastTransitionTime":"2025-10-06T08:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.493630 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.493693 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.493706 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.493729 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.493744 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:44Z","lastTransitionTime":"2025-10-06T08:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.596044 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.596087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.596097 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.596112 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.596123 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:44Z","lastTransitionTime":"2025-10-06T08:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.700487 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.700542 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.700555 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.700605 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.700628 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:44Z","lastTransitionTime":"2025-10-06T08:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.803373 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.803421 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.803436 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.803453 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.803463 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:44Z","lastTransitionTime":"2025-10-06T08:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.878519 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:44 crc kubenswrapper[4755]: E1006 08:23:44.878927 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.879040 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.880010 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.880021 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:44 crc kubenswrapper[4755]: E1006 08:23:44.884767 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:44 crc kubenswrapper[4755]: E1006 08:23:44.885039 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:44 crc kubenswrapper[4755]: E1006 08:23:44.885498 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.907179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.907270 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.907290 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.907322 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:44 crc kubenswrapper[4755]: I1006 08:23:44.907343 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:44Z","lastTransitionTime":"2025-10-06T08:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.010407 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.010457 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.010467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.010486 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.010499 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:45Z","lastTransitionTime":"2025-10-06T08:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.113769 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.113839 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.113853 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.113881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.113897 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:45Z","lastTransitionTime":"2025-10-06T08:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.217078 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.217123 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.217134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.217153 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.217164 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:45Z","lastTransitionTime":"2025-10-06T08:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.320524 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.320627 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.320643 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.320670 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.320687 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:45Z","lastTransitionTime":"2025-10-06T08:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.423387 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.423457 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.423471 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.423493 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.423511 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:45Z","lastTransitionTime":"2025-10-06T08:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.526045 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.526098 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.526108 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.526124 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.526133 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:45Z","lastTransitionTime":"2025-10-06T08:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.631044 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.631100 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.631111 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.631134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.631146 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:45Z","lastTransitionTime":"2025-10-06T08:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.733623 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.733669 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.733683 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.733706 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.733720 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:45Z","lastTransitionTime":"2025-10-06T08:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.836503 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.836553 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.836580 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.836603 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.836624 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:45Z","lastTransitionTime":"2025-10-06T08:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.939678 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.939747 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.939761 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.939811 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:45 crc kubenswrapper[4755]: I1006 08:23:45.939827 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:45Z","lastTransitionTime":"2025-10-06T08:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.042524 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.042639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.042654 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.042680 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.042698 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:46Z","lastTransitionTime":"2025-10-06T08:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.145829 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.145871 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.145881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.145899 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.145910 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:46Z","lastTransitionTime":"2025-10-06T08:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.249224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.249271 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.249284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.249305 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.249320 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:46Z","lastTransitionTime":"2025-10-06T08:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.352845 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.352901 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.352916 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.352939 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.352955 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:46Z","lastTransitionTime":"2025-10-06T08:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.455878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.455931 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.455944 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.455969 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.455982 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:46Z","lastTransitionTime":"2025-10-06T08:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.559894 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.559947 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.559961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.559982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.559999 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:46Z","lastTransitionTime":"2025-10-06T08:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.663289 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.663346 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.663360 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.663379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.663392 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:46Z","lastTransitionTime":"2025-10-06T08:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.766808 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.766865 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.766880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.766899 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.766912 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:46Z","lastTransitionTime":"2025-10-06T08:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.871037 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.871099 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.871110 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.871132 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.871145 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:46Z","lastTransitionTime":"2025-10-06T08:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.877853 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.877930 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.878217 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.878216 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.878270 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.878443 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.878640 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.878752 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.935803 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.936069 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:50.936020107 +0000 UTC m=+147.765335361 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.936186 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.936301 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.936355 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.936429 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.936442 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.936502 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.936530 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.936531 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.936544 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.936558 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:23:46 crc 
kubenswrapper[4755]: E1006 08:23:46.936614 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.936629 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:24:50.936610541 +0000 UTC m=+147.765925755 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.936505 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.936716 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:24:50.936690534 +0000 UTC m=+147.766005788 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.936757 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:24:50.936741656 +0000 UTC m=+147.766056920 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:23:46 crc kubenswrapper[4755]: E1006 08:23:46.936788 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:24:50.936770746 +0000 UTC m=+147.766086030 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.974235 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.974288 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.974298 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.974319 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:46 crc kubenswrapper[4755]: I1006 08:23:46.974332 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:46Z","lastTransitionTime":"2025-10-06T08:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.078425 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.078504 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.078528 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.078555 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.078618 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:47Z","lastTransitionTime":"2025-10-06T08:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.181766 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.181838 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.181853 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.181882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.181897 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:47Z","lastTransitionTime":"2025-10-06T08:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.284964 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.285032 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.285044 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.285081 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.285093 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:47Z","lastTransitionTime":"2025-10-06T08:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.388419 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.388507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.388526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.388556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.388608 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:47Z","lastTransitionTime":"2025-10-06T08:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.494976 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.495027 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.495041 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.495117 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.495137 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:47Z","lastTransitionTime":"2025-10-06T08:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.598995 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.599049 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.599057 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.599074 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.599085 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:47Z","lastTransitionTime":"2025-10-06T08:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.702756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.703181 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.703191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.703207 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.703217 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:47Z","lastTransitionTime":"2025-10-06T08:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.805671 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.805750 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.805766 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.805792 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.805810 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:47Z","lastTransitionTime":"2025-10-06T08:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.907903 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.907961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.907978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.908000 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:47 crc kubenswrapper[4755]: I1006 08:23:47.908018 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:47Z","lastTransitionTime":"2025-10-06T08:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.012322 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.012447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.012482 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.012521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.012543 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:48Z","lastTransitionTime":"2025-10-06T08:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.115881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.115927 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.115939 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.115957 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.115968 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:48Z","lastTransitionTime":"2025-10-06T08:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.218502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.218546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.218558 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.218591 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.218603 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:48Z","lastTransitionTime":"2025-10-06T08:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.322474 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.322526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.322539 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.322587 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.322605 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:48Z","lastTransitionTime":"2025-10-06T08:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.426363 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.426421 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.426435 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.426458 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.426476 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:48Z","lastTransitionTime":"2025-10-06T08:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.529678 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.529761 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.529778 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.529802 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.529816 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:48Z","lastTransitionTime":"2025-10-06T08:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.632867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.632975 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.632997 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.633071 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.633090 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:48Z","lastTransitionTime":"2025-10-06T08:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.736424 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.736498 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.736512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.736532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.736547 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:48Z","lastTransitionTime":"2025-10-06T08:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.840465 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.840548 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.840622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.840660 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.840686 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:48Z","lastTransitionTime":"2025-10-06T08:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.878577 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.878613 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.878630 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:48 crc kubenswrapper[4755]: E1006 08:23:48.878752 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.878777 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:48 crc kubenswrapper[4755]: E1006 08:23:48.878845 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:48 crc kubenswrapper[4755]: E1006 08:23:48.879100 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:48 crc kubenswrapper[4755]: E1006 08:23:48.879186 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.943531 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.943603 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.943613 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.943629 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:48 crc kubenswrapper[4755]: I1006 08:23:48.943643 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:48Z","lastTransitionTime":"2025-10-06T08:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.047285 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.047338 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.047352 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.047373 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.047389 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:49Z","lastTransitionTime":"2025-10-06T08:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.150941 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.150990 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.151002 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.151022 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.151033 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:49Z","lastTransitionTime":"2025-10-06T08:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.254290 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.254337 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.254350 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.254373 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.254387 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:49Z","lastTransitionTime":"2025-10-06T08:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.357491 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.357543 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.357555 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.357601 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.357619 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:49Z","lastTransitionTime":"2025-10-06T08:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.461128 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.461210 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.461230 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.461253 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.461267 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:49Z","lastTransitionTime":"2025-10-06T08:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.565354 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.565512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.565616 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.565705 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.565772 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:49Z","lastTransitionTime":"2025-10-06T08:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.673758 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.673834 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.673854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.673882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.673909 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:49Z","lastTransitionTime":"2025-10-06T08:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.778187 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.778237 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.778250 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.778273 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.778287 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:49Z","lastTransitionTime":"2025-10-06T08:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.883370 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.883462 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.883554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.883707 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.883814 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:49Z","lastTransitionTime":"2025-10-06T08:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.986380 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.986430 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.986442 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.986460 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:49 crc kubenswrapper[4755]: I1006 08:23:49.986471 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:49Z","lastTransitionTime":"2025-10-06T08:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.089257 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.089315 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.089325 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.089343 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.089355 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:50Z","lastTransitionTime":"2025-10-06T08:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.192793 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.192864 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.192885 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.192911 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.192930 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:50Z","lastTransitionTime":"2025-10-06T08:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.296699 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.296741 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.296759 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.296779 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.296790 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:50Z","lastTransitionTime":"2025-10-06T08:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.399106 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.399183 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.399202 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.399235 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.399259 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:50Z","lastTransitionTime":"2025-10-06T08:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.503035 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.503102 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.503120 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.503146 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.503171 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:50Z","lastTransitionTime":"2025-10-06T08:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.606022 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.606106 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.606141 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.606172 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.606194 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:50Z","lastTransitionTime":"2025-10-06T08:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.709749 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.709814 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.709826 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.709847 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.709863 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:50Z","lastTransitionTime":"2025-10-06T08:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.813344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.813406 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.813417 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.813441 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.813456 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:50Z","lastTransitionTime":"2025-10-06T08:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.878004 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.878029 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:50 crc kubenswrapper[4755]: E1006 08:23:50.878323 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.878197 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:50 crc kubenswrapper[4755]: E1006 08:23:50.878510 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.878158 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:50 crc kubenswrapper[4755]: E1006 08:23:50.878737 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:50 crc kubenswrapper[4755]: E1006 08:23:50.878849 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.917726 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.917810 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.917833 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.917864 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.917884 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:50Z","lastTransitionTime":"2025-10-06T08:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.980361 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.980451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.980470 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.980499 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:50 crc kubenswrapper[4755]: I1006 08:23:50.980518 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:50Z","lastTransitionTime":"2025-10-06T08:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:51 crc kubenswrapper[4755]: E1006 08:23:51.002652 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:50Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.007532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.007610 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.007690 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.007713 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.007724 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:51Z","lastTransitionTime":"2025-10-06T08:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:51 crc kubenswrapper[4755]: E1006 08:23:51.024770 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.031900 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.031960 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.031975 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.032001 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.032015 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:51Z","lastTransitionTime":"2025-10-06T08:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:51 crc kubenswrapper[4755]: E1006 08:23:51.049842 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.055092 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.055145 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.055159 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.055183 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.055198 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:51Z","lastTransitionTime":"2025-10-06T08:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:51 crc kubenswrapper[4755]: E1006 08:23:51.074481 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.080696 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.080775 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.080801 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.080835 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.080857 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:51Z","lastTransitionTime":"2025-10-06T08:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:51 crc kubenswrapper[4755]: E1006 08:23:51.112864 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:51Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:51 crc kubenswrapper[4755]: E1006 08:23:51.113321 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.116228 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.116301 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.116328 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.116366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.116407 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:51Z","lastTransitionTime":"2025-10-06T08:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.219722 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.219816 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.219836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.219867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.219887 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:51Z","lastTransitionTime":"2025-10-06T08:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.323177 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.323259 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.323284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.323317 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.323338 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:51Z","lastTransitionTime":"2025-10-06T08:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.427363 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.427427 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.427440 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.427461 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.427474 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:51Z","lastTransitionTime":"2025-10-06T08:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.531031 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.531080 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.531090 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.531111 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.531123 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:51Z","lastTransitionTime":"2025-10-06T08:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.634813 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.634871 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.634883 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.634900 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.634910 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:51Z","lastTransitionTime":"2025-10-06T08:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.740248 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.740295 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.740306 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.740332 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.740345 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:51Z","lastTransitionTime":"2025-10-06T08:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.843306 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.843360 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.843370 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.843392 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.843405 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:51Z","lastTransitionTime":"2025-10-06T08:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.946039 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.946096 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.946113 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.946138 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:51 crc kubenswrapper[4755]: I1006 08:23:51.946155 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:51Z","lastTransitionTime":"2025-10-06T08:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.050650 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.050725 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.050745 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.050771 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.050786 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:52Z","lastTransitionTime":"2025-10-06T08:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.153141 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.153198 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.153211 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.153253 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.153264 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:52Z","lastTransitionTime":"2025-10-06T08:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.257333 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.257403 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.257423 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.257450 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.257475 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:52Z","lastTransitionTime":"2025-10-06T08:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.361421 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.361512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.361535 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.361614 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.361652 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:52Z","lastTransitionTime":"2025-10-06T08:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.465544 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.465635 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.465649 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.465671 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.465687 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:52Z","lastTransitionTime":"2025-10-06T08:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.569103 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.569175 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.569189 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.569214 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.569231 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:52Z","lastTransitionTime":"2025-10-06T08:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.671979 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.672029 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.672039 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.672062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.672077 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:52Z","lastTransitionTime":"2025-10-06T08:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.775589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.775642 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.775652 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.775673 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.775687 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:52Z","lastTransitionTime":"2025-10-06T08:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.878230 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.878256 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.878315 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.878379 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:52 crc kubenswrapper[4755]: E1006 08:23:52.878462 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:52 crc kubenswrapper[4755]: E1006 08:23:52.878623 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:52 crc kubenswrapper[4755]: E1006 08:23:52.878759 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.878807 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.878858 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.878882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.878916 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:52 crc kubenswrapper[4755]: E1006 08:23:52.878917 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.878944 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:52Z","lastTransitionTime":"2025-10-06T08:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.982877 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.982954 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.982987 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.983028 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:52 crc kubenswrapper[4755]: I1006 08:23:52.983053 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:52Z","lastTransitionTime":"2025-10-06T08:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.086704 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.086785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.086809 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.086840 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.086867 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:53Z","lastTransitionTime":"2025-10-06T08:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.190998 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.191435 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.191500 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.191615 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.191699 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:53Z","lastTransitionTime":"2025-10-06T08:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.295856 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.295945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.295972 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.296009 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.296047 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:53Z","lastTransitionTime":"2025-10-06T08:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.399928 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.400589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.400786 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.400934 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.401080 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:53Z","lastTransitionTime":"2025-10-06T08:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.505855 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.505933 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.505947 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.505975 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.505989 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:53Z","lastTransitionTime":"2025-10-06T08:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.610164 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.610277 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.610308 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.610348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.610375 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:53Z","lastTransitionTime":"2025-10-06T08:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.713715 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.713805 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.713823 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.713852 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.713872 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:53Z","lastTransitionTime":"2025-10-06T08:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.817720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.817788 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.817809 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.817842 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.817862 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:53Z","lastTransitionTime":"2025-10-06T08:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.903885 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.917181 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba582c30-5753-4c4d-99d9-ad31cd59ec1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93874dc90338ebd50d41428b77b4e2dd449e76144dd24496e5a600b34d0493c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9ef9720e2410a56e4c7545511fb13d9bd68254cf0072d9dc6afb84de237a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7baf461ac5121358231a5700611f38875e26386b1fe59a2b49ae3b2d976fe083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.921626 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.921702 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.921722 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.921749 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.921768 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:53Z","lastTransitionTime":"2025-10-06T08:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.932038 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.944875 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:23:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.959080 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:53 crc kubenswrapper[4755]: I1006 08:23:53.980378 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:53Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.010916 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.024291 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.024341 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.024351 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.024369 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.024381 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:54Z","lastTransitionTime":"2025-10-06T08:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.037141 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.072147 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.086376 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.102780 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e
6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.114996 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.126996 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.127046 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.127057 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:54 crc 
kubenswrapper[4755]: I1006 08:23:54.127081 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.127095 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:54Z","lastTransitionTime":"2025-10-06T08:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.127665 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf25180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06
T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.143529 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.158988 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:31Z\\\",\\\"message\\\":\\\"2025-10-06T08:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8\\\\n2025-10-06T08:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8 to /host/opt/cni/bin/\\\\n2025-10-06T08:22:46Z [verbose] multus-daemon started\\\\n2025-10-06T08:22:46Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T08:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.179559 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:141\\\\nI1006 08:23:38.861216 6733 reflector.go:311] Stopping 
reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:23:38.861627 6733 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:23:38.861886 6733 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 08:23:38.861949 6733 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 08:23:38.861963 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:23:38.861980 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:23:38.861991 6733 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:23:38.862018 6733 factory.go:656] Stopping watch factory\\\\nI1006 08:23:38.862044 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:23:38.862055 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:23:38.862069 6733 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.191708 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.204363 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:23:54Z is after 2025-08-24T17:21:41Z" Oct 06 08:23:54 crc 
kubenswrapper[4755]: I1006 08:23:54.230317 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.230370 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.230383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.230407 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.230424 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:54Z","lastTransitionTime":"2025-10-06T08:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.334018 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.334068 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.334081 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.334101 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.334116 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:54Z","lastTransitionTime":"2025-10-06T08:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.437911 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.437978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.437995 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.438017 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.438031 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:54Z","lastTransitionTime":"2025-10-06T08:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.541366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.541426 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.541437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.541458 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.541471 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:54Z","lastTransitionTime":"2025-10-06T08:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.644513 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.644595 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.644607 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.644666 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.644680 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:54Z","lastTransitionTime":"2025-10-06T08:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.748191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.748250 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.748260 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.748286 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.748297 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:54Z","lastTransitionTime":"2025-10-06T08:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.852018 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.852075 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.852089 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.852109 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.852125 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:54Z","lastTransitionTime":"2025-10-06T08:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.878085 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:54 crc kubenswrapper[4755]: E1006 08:23:54.878315 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.878648 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:54 crc kubenswrapper[4755]: E1006 08:23:54.878737 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.878821 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.878992 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:54 crc kubenswrapper[4755]: E1006 08:23:54.879022 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:54 crc kubenswrapper[4755]: E1006 08:23:54.879179 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.955356 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.955442 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.955463 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.955492 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:54 crc kubenswrapper[4755]: I1006 08:23:54.955512 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:54Z","lastTransitionTime":"2025-10-06T08:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.058432 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.058508 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.058523 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.058551 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.058609 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:55Z","lastTransitionTime":"2025-10-06T08:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.161983 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.162060 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.162081 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.162111 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.162131 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:55Z","lastTransitionTime":"2025-10-06T08:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.266338 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.266418 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.266511 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.266630 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.266673 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:55Z","lastTransitionTime":"2025-10-06T08:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.370833 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.370908 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.370936 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.370970 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.370992 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:55Z","lastTransitionTime":"2025-10-06T08:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.473875 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.473967 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.473984 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.474011 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.474029 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:55Z","lastTransitionTime":"2025-10-06T08:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.577877 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.577944 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.577957 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.577980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.577995 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:55Z","lastTransitionTime":"2025-10-06T08:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.681795 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.681878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.681908 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.681937 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.681956 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:55Z","lastTransitionTime":"2025-10-06T08:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.785086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.785156 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.785170 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.785195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.785210 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:55Z","lastTransitionTime":"2025-10-06T08:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.879779 4755 scope.go:117] "RemoveContainer" containerID="5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd" Oct 06 08:23:55 crc kubenswrapper[4755]: E1006 08:23:55.880205 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.888338 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.888468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.888499 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.888533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.888556 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:55Z","lastTransitionTime":"2025-10-06T08:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.992007 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.992056 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.992068 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.992091 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:55 crc kubenswrapper[4755]: I1006 08:23:55.992103 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:55Z","lastTransitionTime":"2025-10-06T08:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.096663 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.096753 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.096766 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.096786 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.096799 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:56Z","lastTransitionTime":"2025-10-06T08:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.201047 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.201128 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.201142 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.201164 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.201178 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:56Z","lastTransitionTime":"2025-10-06T08:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.305169 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.305215 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.305223 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.305242 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.305258 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:56Z","lastTransitionTime":"2025-10-06T08:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.408514 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.408636 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.408662 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.408700 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.408720 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:56Z","lastTransitionTime":"2025-10-06T08:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.516467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.516523 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.516540 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.516582 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.516597 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:56Z","lastTransitionTime":"2025-10-06T08:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.620382 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.620425 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.620435 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.620451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.620461 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:56Z","lastTransitionTime":"2025-10-06T08:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.724551 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.724658 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.724678 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.724707 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.724766 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:56Z","lastTransitionTime":"2025-10-06T08:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.828630 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.828707 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.828731 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.828769 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.828794 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:56Z","lastTransitionTime":"2025-10-06T08:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.878853 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:56 crc kubenswrapper[4755]: E1006 08:23:56.879194 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.879929 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.880303 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.880368 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:56 crc kubenswrapper[4755]: E1006 08:23:56.880471 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:56 crc kubenswrapper[4755]: E1006 08:23:56.880912 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:56 crc kubenswrapper[4755]: E1006 08:23:56.881124 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.900118 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.932870 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.932968 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.932988 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.933013 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:56 crc kubenswrapper[4755]: I1006 08:23:56.933035 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:56Z","lastTransitionTime":"2025-10-06T08:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.037184 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.037242 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.037260 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.037282 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.037297 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:57Z","lastTransitionTime":"2025-10-06T08:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.141689 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.141764 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.141785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.141818 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.141839 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:57Z","lastTransitionTime":"2025-10-06T08:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.245226 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.245286 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.245298 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.245321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.245350 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:57Z","lastTransitionTime":"2025-10-06T08:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.349977 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.350070 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.350087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.350109 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.350136 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:57Z","lastTransitionTime":"2025-10-06T08:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.455398 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.455468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.455482 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.455507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.455521 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:57Z","lastTransitionTime":"2025-10-06T08:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.558478 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.558555 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.558612 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.558644 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.558671 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:57Z","lastTransitionTime":"2025-10-06T08:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.662971 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.663070 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.663092 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.663126 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.663148 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:57Z","lastTransitionTime":"2025-10-06T08:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.766510 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.766631 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.766650 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.766679 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.766698 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:57Z","lastTransitionTime":"2025-10-06T08:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.871466 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.871606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.871635 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.871672 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.871699 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:57Z","lastTransitionTime":"2025-10-06T08:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.975435 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.975517 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.975546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.975929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:57 crc kubenswrapper[4755]: I1006 08:23:57.975972 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:57Z","lastTransitionTime":"2025-10-06T08:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.079048 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.079118 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.079131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.079170 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.079182 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:58Z","lastTransitionTime":"2025-10-06T08:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.182805 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.182868 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.182881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.182902 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.182917 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:58Z","lastTransitionTime":"2025-10-06T08:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.286409 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.286501 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.286531 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.286622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.286655 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:58Z","lastTransitionTime":"2025-10-06T08:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.390358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.390435 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.390458 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.390542 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.390641 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:58Z","lastTransitionTime":"2025-10-06T08:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.494270 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.494340 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.494362 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.494394 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.494416 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:58Z","lastTransitionTime":"2025-10-06T08:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.598557 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.598681 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.598700 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.598732 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.598754 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:58Z","lastTransitionTime":"2025-10-06T08:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.702689 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.702769 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.702815 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.702851 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.702874 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:58Z","lastTransitionTime":"2025-10-06T08:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.806955 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.807015 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.807033 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.807054 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.807066 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:58Z","lastTransitionTime":"2025-10-06T08:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.885118 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.885220 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:23:58 crc kubenswrapper[4755]: E1006 08:23:58.886481 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.887093 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.887120 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:23:58 crc kubenswrapper[4755]: E1006 08:23:58.887439 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:23:58 crc kubenswrapper[4755]: E1006 08:23:58.888227 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:23:58 crc kubenswrapper[4755]: E1006 08:23:58.888829 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.911002 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.911070 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.911091 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.911117 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:58 crc kubenswrapper[4755]: I1006 08:23:58.911138 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:58Z","lastTransitionTime":"2025-10-06T08:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.013655 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.013703 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.013720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.013744 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.013762 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:59Z","lastTransitionTime":"2025-10-06T08:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.116710 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.116759 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.116781 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.116806 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.116819 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:59Z","lastTransitionTime":"2025-10-06T08:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.220262 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.220323 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.220339 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.220364 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.220382 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:59Z","lastTransitionTime":"2025-10-06T08:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.323898 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.323972 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.323986 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.324010 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.324027 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:59Z","lastTransitionTime":"2025-10-06T08:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.427207 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.427279 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.427292 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.427315 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.427329 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:59Z","lastTransitionTime":"2025-10-06T08:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.531330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.531417 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.531442 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.531482 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.531504 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:59Z","lastTransitionTime":"2025-10-06T08:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.635438 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.635536 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.635560 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.635638 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.635667 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:59Z","lastTransitionTime":"2025-10-06T08:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.739528 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.739618 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.739634 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.739658 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.739673 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:59Z","lastTransitionTime":"2025-10-06T08:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.842453 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.842552 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.842579 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.842599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.842609 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:59Z","lastTransitionTime":"2025-10-06T08:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.945330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.945418 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.945436 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.945471 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:23:59 crc kubenswrapper[4755]: I1006 08:23:59.945488 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:23:59Z","lastTransitionTime":"2025-10-06T08:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.049512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.049623 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.049642 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.049669 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.049690 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:00Z","lastTransitionTime":"2025-10-06T08:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.153339 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.153440 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.153464 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.153500 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.153525 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:00Z","lastTransitionTime":"2025-10-06T08:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.257179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.257251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.257268 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.257295 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.257312 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:00Z","lastTransitionTime":"2025-10-06T08:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.359939 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.360017 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.360038 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.360070 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.360094 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:00Z","lastTransitionTime":"2025-10-06T08:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.463625 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.463737 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.463757 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.463782 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.463803 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:00Z","lastTransitionTime":"2025-10-06T08:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.566869 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.566941 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.566966 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.566999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.567023 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:00Z","lastTransitionTime":"2025-10-06T08:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.671247 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.671318 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.671338 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.671369 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.671394 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:00Z","lastTransitionTime":"2025-10-06T08:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.774506 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.774546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.774556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.774589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.774599 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:00Z","lastTransitionTime":"2025-10-06T08:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.877674 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.877759 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.877790 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.877791 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.877830 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.877753 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.877862 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:00Z","lastTransitionTime":"2025-10-06T08:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.877791 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.877937 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:00 crc kubenswrapper[4755]: E1006 08:24:00.877956 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:00 crc kubenswrapper[4755]: E1006 08:24:00.878073 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:00 crc kubenswrapper[4755]: E1006 08:24:00.878154 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:00 crc kubenswrapper[4755]: E1006 08:24:00.878340 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.980862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.980922 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.980941 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.980972 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:00 crc kubenswrapper[4755]: I1006 08:24:00.980987 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:00Z","lastTransitionTime":"2025-10-06T08:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.084616 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.084671 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.084681 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.084700 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.084712 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.187195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.187235 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.187245 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.187260 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.187272 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.290346 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.290408 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.290422 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.290442 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.290455 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.394067 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.394115 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.394124 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.394142 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.394152 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.466487 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.466543 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.466555 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.466597 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.466611 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: E1006 08:24:01.489268 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:01Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.495778 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.496010 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.496046 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.496186 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.496367 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: E1006 08:24:01.519112 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:01Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.524757 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.524839 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.524856 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.524880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.524921 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: E1006 08:24:01.540440 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:01Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.544923 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.544974 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.544987 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.545008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.545026 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: E1006 08:24:01.561452 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:01Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.566203 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.566254 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.566270 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.566296 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.566314 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: E1006 08:24:01.583433 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:01Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:01 crc kubenswrapper[4755]: E1006 08:24:01.583651 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.585852 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.585891 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.585904 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.585927 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.585943 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.689288 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.689339 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.689350 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.689368 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.689378 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.791969 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.792030 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.792044 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.792063 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.792075 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.895311 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.895360 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.895371 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.895391 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.895404 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.999447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.999512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:01 crc kubenswrapper[4755]: I1006 08:24:01.999525 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:01.999554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:01.999611 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:01Z","lastTransitionTime":"2025-10-06T08:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.103061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.103133 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.103159 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.103197 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.103221 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:02Z","lastTransitionTime":"2025-10-06T08:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.206074 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.206126 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.206138 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.206156 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.206168 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:02Z","lastTransitionTime":"2025-10-06T08:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.310122 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.310186 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.310202 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.310255 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.310272 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:02Z","lastTransitionTime":"2025-10-06T08:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.413688 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.413759 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.413778 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.413810 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.413829 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:02Z","lastTransitionTime":"2025-10-06T08:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.429657 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:02 crc kubenswrapper[4755]: E1006 08:24:02.429885 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:24:02 crc kubenswrapper[4755]: E1006 08:24:02.430009 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs podName:60fbd235-a60f-436e-9552-e3eaf60f24f3 nodeName:}" failed. No retries permitted until 2025-10-06 08:25:06.429970637 +0000 UTC m=+163.259285891 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs") pod "network-metrics-daemon-vf9ht" (UID: "60fbd235-a60f-436e-9552-e3eaf60f24f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.518316 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.518396 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.518440 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.518484 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.518512 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:02Z","lastTransitionTime":"2025-10-06T08:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.621397 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.621444 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.621454 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.621472 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.621483 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:02Z","lastTransitionTime":"2025-10-06T08:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.723839 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.723891 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.723902 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.723929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.723946 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:02Z","lastTransitionTime":"2025-10-06T08:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.827370 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.827451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.827474 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.827512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.827538 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:02Z","lastTransitionTime":"2025-10-06T08:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.878173 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.878349 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.878384 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:02 crc kubenswrapper[4755]: E1006 08:24:02.878455 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.878607 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:02 crc kubenswrapper[4755]: E1006 08:24:02.878720 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:02 crc kubenswrapper[4755]: E1006 08:24:02.878826 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:02 crc kubenswrapper[4755]: E1006 08:24:02.878898 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.931070 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.931125 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.931138 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.931161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:02 crc kubenswrapper[4755]: I1006 08:24:02.931175 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:02Z","lastTransitionTime":"2025-10-06T08:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.040436 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.040516 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.040529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.040549 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.040579 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:03Z","lastTransitionTime":"2025-10-06T08:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.143526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.143691 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.143719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.143758 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.143785 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:03Z","lastTransitionTime":"2025-10-06T08:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.247380 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.247470 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.247492 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.247518 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.247536 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:03Z","lastTransitionTime":"2025-10-06T08:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.351969 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.352043 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.352061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.352088 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.352102 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:03Z","lastTransitionTime":"2025-10-06T08:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.458225 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.458284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.458299 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.458320 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.458333 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:03Z","lastTransitionTime":"2025-10-06T08:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.561524 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.561624 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.561638 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.561658 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.561670 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:03Z","lastTransitionTime":"2025-10-06T08:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.665844 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.665912 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.665931 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.665959 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.665979 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:03Z","lastTransitionTime":"2025-10-06T08:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.769883 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.770007 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.770029 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.770061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.770081 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:03Z","lastTransitionTime":"2025-10-06T08:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.873181 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.873284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.873696 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.873747 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.873768 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:03Z","lastTransitionTime":"2025-10-06T08:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.895852 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"256c354e-2485-4707-9f6b-bf9ca8c178d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af26ae6bcc459bdffb5b3d349c864e2bf5a8c9fdcebbf05a57b081788fb044f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5a92d9b20a4ef845d9eb869c33e525fc0325261b2d9041cb8b2a9b8097cc2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5a92d9b20a4ef845d9eb869c33e525fc0325261b2d9041cb8b2a9b8097cc2e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.913354 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f8efa2b-e966-4987-9fd2-222d159f2123\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcefe050e664d6c4ecced626143d7fdc2de9fcdebf1ea3252dabc4a04218ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87860e6e9a8393d4ea3db98402e769520a1a333916a4a22aa1f018f5d8544757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ac2fc71c43a22fb0c37426cd9704ea237e08579a303b60f0da7764cb0ee95a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7110eb24689b0ca919d6b5abea298781ced6316b3de69744e3f96d6e0e04bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4e91050500727c7709bf3634ac0c059d31800bb4b9b3af24a90c8acd10b76b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:22:43Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1006 08:22:42.952859 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 08:22:42.953246 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1777789502/tls.crt::/tmp/serving-cert-1777789502/tls.key\\\\\\\"\\\\nI1006 08:22:42.953392 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 08:22:42.953805 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953860 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1006 08:22:42.953900 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953914 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 08:22:42.953934 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1006 08:22:42.953950 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1006 08:22:42.954059 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 08:22:42.954118 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 08:22:42.962885 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 08:22:42.963069 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
08:22:42.963185 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 08:22:42.965987 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://736a353d03f9e1566153b601a079c4107ccf2258e4e00252bba8a17d66142a02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://636952783f692eaf3330de5cfc68294f6ed6b0b136cb313c84915780b6d4ac31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.929640 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2098c5a7-c6dc-4f6f-9dce-0f403c52d577\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be102700835f15709c8861e6c6352d682cfa8ba0a8b1b99f3b4be9be1f26e792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b233a2a8ce984815462f36a15d605edd8c2a739be4cccee6e290603337796a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb27c25d072dc6d65140e2168008f0bb7e6e26b550f0795255e413b30ea816a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf220f7603318b5dd2efb56d9bf12d787d9ffec014ba200b55bdc54d94c4e8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.947398 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcbaead363b8bf9bee69c4d3ca390678adb5c75b05d203dafe8aa8e4059d5910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.960652 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jxm75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ff8aa79-3b9f-472a-9a36-0e92cbf9e6f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4cacbaa7ee99c1d105108940ede8cb6ccdc896ecd979edf5ab622b28849de64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzb9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jxm75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.977161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.977217 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.977234 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.977255 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.977194 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsg89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b19d445e-b55b-46be-ab4f-ad2d72a966b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ced70e2884143c3e6f2ab35bed2be1d3c21137e454d53feaeaca101b360069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e88485814b88a75187e23a609cee5bb46a1c689412a8c5f8d13be06c6eb876e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a4f098de07147409a51e8eb48d29960fcf07bdbc163876ac477a331e8e322f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55d392130d4bcc626f4063795e87261abe6c465e727bdb85c74d7a641d20b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1834
285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1834285923e8c0863d008d403c00898ee862c239d0e3e36bc2edb9b98447910\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9caf9a6c04e42f89d365e4c4aa6ae8040a785458d5ab2230c2c8d2b226bddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45fc6b912f29cff382e054ea2cb36bde2984280fd52dc4de630c704ea15e5734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt4kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsg89\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.977265 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:03Z","lastTransitionTime":"2025-10-06T08:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:03 crc kubenswrapper[4755]: I1006 08:24:03.990271 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5680a0f34387e2682162e3b6ff5665bf8c65ed25eafc623436fe795232df8952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2
df1b8b69c37a96dce650dff2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prjlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfqsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.003173 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfe4c263-9750-4b65-b308-b998f3fa1eae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f25f4bdeff027f1dc03ac92edd456c0c6630611b3c569437a0895407405e079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79909fb6aa4c21171a7e5ca4677bfd840bf2
5180e3310df04661a162a0a567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrxvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6m7xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.018767 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.040239 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r96nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"891dff9a-4752-4022-83fc-51f626c76991\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:31Z\\\",\\\"message\\\":\\\"2025-10-06T08:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8\\\\n2025-10-06T08:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7f19ba8b-92d4-438a-a7d4-34203e69e3b8 to /host/opt/cni/bin/\\\\n2025-10-06T08:22:46Z [verbose] multus-daemon started\\\\n2025-10-06T08:22:46Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T08:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcggh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r96nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.062426 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b431db-f56c-43e6-9f53-fbc28b857422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:23:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:141\\\\nI1006 08:23:38.861216 6733 reflector.go:311] Stopping 
reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:23:38.861627 6733 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:23:38.861886 6733 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 08:23:38.861949 6733 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 08:23:38.861963 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:23:38.861980 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:23:38.861991 6733 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:23:38.862018 6733 factory.go:656] Stopping watch factory\\\\nI1006 08:23:38.862044 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:23:38.862055 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:23:38.862069 6733 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:23:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7a90b4a3934614f95
3c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22sj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r8qq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.074232 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mh26r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aab0aad-4968-4984-92fe-b4920f08da9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45561146715b7e87cb3f542c155a951ffdb4db9fa65d37bf914f5cf0b6a5f9c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7fq5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mh26r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.080471 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.080495 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.080505 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.080522 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.080534 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:04Z","lastTransitionTime":"2025-10-06T08:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.085575 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60fbd235-a60f-436e-9552-e3eaf60f24f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm9nn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vf9ht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:04 crc 
kubenswrapper[4755]: I1006 08:24:04.105827 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f664a4a-56c9-4b63-9bea-99bda7a8ea99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed3e5139f6dd1396af8269716dfe8e820c5cab29ea77d5951fe97d8197c0d677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a35e75090826c4a696caa3602cad1b4f47cea5ba7c0ec3355bcc2d4235302cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5159d090698bbe26fd94134c1e837d1f9459c6d5f11abdee97b3566bbfd87e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfcb457aa60a47ff5c8f41a80c7ebd182c6d37085e1a7e0d7276de38293b0c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bec8676cd5d6bc7c4ba3584a504c56347826e0e5b59d01a4f05bcb8c983233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04923e6d05f2e95c35fed770f3f1bbc77444559c945c76def46badd19e872f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fe7a2b2e3f99eb824df61e69b71fd7c099461bb74229fff3f5d03d21994a762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b766567c616459d3c117c95bc3e229d003048a80e6afaedc1044d078985476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.120288 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba582c30-5753-4c4d-99d9-ad31cd59ec1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93874dc90338ebd50d41428b77b4e2dd449e76144dd24496e5a600b34d0493c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9ef9720e2410a56e4c7545511fb13d9bd68254cf0072d9dc6afb84de237a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7baf461ac5121358231a5700611f38875e26386b1fe59a2b49ae3b2d976fe083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://aa8073772761ca621540d3cf7ef45e46306899896944211e0967474536258292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:22:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:22:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:22:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.135491 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.151147 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a475859c837fd92c28798833690e7aae463680a19138c3bf8ddc7400550ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:24:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.171367 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceeb6975437831797b63e41bbb5c6227169a728a6eddb154ab3fd7d2a4d33cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6697d1cebe4a6f
9117d894fa2996daf6e75d8b14acb26bf4c13b10402c5bc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.184197 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.184269 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.184281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.184305 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.184319 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:04Z","lastTransitionTime":"2025-10-06T08:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.188357 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.287501 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.287580 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.287601 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.287626 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.287645 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:04Z","lastTransitionTime":"2025-10-06T08:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.391122 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.391189 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.391214 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.391245 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.391264 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:04Z","lastTransitionTime":"2025-10-06T08:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.495284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.495361 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.495380 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.495407 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.495431 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:04Z","lastTransitionTime":"2025-10-06T08:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.598780 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.598831 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.598841 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.598856 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.598866 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:04Z","lastTransitionTime":"2025-10-06T08:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.701637 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.701694 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.701716 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.701736 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.701751 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:04Z","lastTransitionTime":"2025-10-06T08:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.804810 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.804894 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.804912 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.804941 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.804964 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:04Z","lastTransitionTime":"2025-10-06T08:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.878485 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.878664 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.878485 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:04 crc kubenswrapper[4755]: E1006 08:24:04.878691 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.878522 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:04 crc kubenswrapper[4755]: E1006 08:24:04.878935 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:04 crc kubenswrapper[4755]: E1006 08:24:04.878986 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:04 crc kubenswrapper[4755]: E1006 08:24:04.879212 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.907940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.908067 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.908144 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.908185 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:04 crc kubenswrapper[4755]: I1006 08:24:04.908212 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:04Z","lastTransitionTime":"2025-10-06T08:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.012337 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.012409 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.012427 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.012453 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.012471 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:05Z","lastTransitionTime":"2025-10-06T08:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.115655 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.115697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.115706 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.115721 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.115735 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:05Z","lastTransitionTime":"2025-10-06T08:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.218741 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.218795 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.218809 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.218827 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.218840 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:05Z","lastTransitionTime":"2025-10-06T08:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.321163 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.321222 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.321238 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.321259 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.321277 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:05Z","lastTransitionTime":"2025-10-06T08:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.424864 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.424914 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.424927 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.424946 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.424959 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:05Z","lastTransitionTime":"2025-10-06T08:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.528322 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.528403 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.528420 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.528447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.528479 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:05Z","lastTransitionTime":"2025-10-06T08:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.632027 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.632132 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.632160 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.632191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.632210 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:05Z","lastTransitionTime":"2025-10-06T08:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.735263 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.735402 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.735431 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.735460 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.735481 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:05Z","lastTransitionTime":"2025-10-06T08:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.839143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.839185 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.839196 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.839213 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.839226 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:05Z","lastTransitionTime":"2025-10-06T08:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.941906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.941987 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.942001 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.942022 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:05 crc kubenswrapper[4755]: I1006 08:24:05.942036 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:05Z","lastTransitionTime":"2025-10-06T08:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.044773 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.044850 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.044869 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.044904 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.044927 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:06Z","lastTransitionTime":"2025-10-06T08:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.148702 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.148763 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.148780 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.148804 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.148821 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:06Z","lastTransitionTime":"2025-10-06T08:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.252178 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.252266 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.252293 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.252324 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.252348 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:06Z","lastTransitionTime":"2025-10-06T08:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.355636 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.355723 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.355748 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.355782 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.355805 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:06Z","lastTransitionTime":"2025-10-06T08:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.459054 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.459115 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.459132 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.459154 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.459169 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:06Z","lastTransitionTime":"2025-10-06T08:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.562414 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.562540 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.562600 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.562624 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.562639 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:06Z","lastTransitionTime":"2025-10-06T08:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.666226 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.666353 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.666381 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.666416 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.666439 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:06Z","lastTransitionTime":"2025-10-06T08:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.769102 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.769173 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.769196 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.769229 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.769252 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:06Z","lastTransitionTime":"2025-10-06T08:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.872717 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.872767 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.872779 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.872798 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.872826 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:06Z","lastTransitionTime":"2025-10-06T08:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.878361 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.878489 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.878364 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.878511 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:06 crc kubenswrapper[4755]: E1006 08:24:06.878692 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:06 crc kubenswrapper[4755]: E1006 08:24:06.878612 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:06 crc kubenswrapper[4755]: E1006 08:24:06.878796 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:06 crc kubenswrapper[4755]: E1006 08:24:06.878890 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.976319 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.976377 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.976434 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.976465 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:06 crc kubenswrapper[4755]: I1006 08:24:06.976486 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:06Z","lastTransitionTime":"2025-10-06T08:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.079113 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.079332 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.079425 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.079454 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.079471 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:07Z","lastTransitionTime":"2025-10-06T08:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.182906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.182978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.183001 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.183028 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.183048 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:07Z","lastTransitionTime":"2025-10-06T08:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.286938 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.287005 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.287029 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.287060 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.287083 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:07Z","lastTransitionTime":"2025-10-06T08:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.389958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.390032 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.390051 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.390502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.390799 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:07Z","lastTransitionTime":"2025-10-06T08:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.493303 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.493361 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.493373 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.493389 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.493509 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:07Z","lastTransitionTime":"2025-10-06T08:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.596264 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.596334 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.596352 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.596376 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.596393 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:07Z","lastTransitionTime":"2025-10-06T08:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.699060 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.699139 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.699162 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.699192 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.699245 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:07Z","lastTransitionTime":"2025-10-06T08:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.802247 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.802317 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.802341 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.802370 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.802390 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:07Z","lastTransitionTime":"2025-10-06T08:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.878778 4755 scope.go:117] "RemoveContainer" containerID="5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd" Oct 06 08:24:07 crc kubenswrapper[4755]: E1006 08:24:07.878989 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-r8qq9_openshift-ovn-kubernetes(b0b431db-f56c-43e6-9f53-fbc28b857422)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.910098 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.910143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.910162 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.910178 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:07 crc kubenswrapper[4755]: I1006 08:24:07.910191 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:07Z","lastTransitionTime":"2025-10-06T08:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.013148 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.013204 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.013218 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.013236 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.013250 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:08Z","lastTransitionTime":"2025-10-06T08:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.116846 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.116894 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.116908 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.116929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.116943 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:08Z","lastTransitionTime":"2025-10-06T08:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.219852 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.219914 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.219929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.219951 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.219969 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:08Z","lastTransitionTime":"2025-10-06T08:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.322538 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.322619 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.322630 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.322646 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.322658 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:08Z","lastTransitionTime":"2025-10-06T08:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.425999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.426070 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.426087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.426112 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.426129 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:08Z","lastTransitionTime":"2025-10-06T08:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.528672 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.528733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.528757 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.528792 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.528816 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:08Z","lastTransitionTime":"2025-10-06T08:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.632040 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.632091 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.632108 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.632130 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.632147 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:08Z","lastTransitionTime":"2025-10-06T08:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.734503 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.734552 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.734606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.734638 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.734658 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:08Z","lastTransitionTime":"2025-10-06T08:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.836819 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.836850 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.836859 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.836872 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.836881 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:08Z","lastTransitionTime":"2025-10-06T08:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.878104 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:08 crc kubenswrapper[4755]: E1006 08:24:08.878226 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.878384 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:08 crc kubenswrapper[4755]: E1006 08:24:08.878427 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.878524 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:08 crc kubenswrapper[4755]: E1006 08:24:08.878594 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.878692 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:08 crc kubenswrapper[4755]: E1006 08:24:08.878751 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.940403 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.940534 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.940652 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.940729 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:08 crc kubenswrapper[4755]: I1006 08:24:08.940750 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:08Z","lastTransitionTime":"2025-10-06T08:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.043045 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.043118 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.043141 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.043173 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.043198 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:09Z","lastTransitionTime":"2025-10-06T08:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.145518 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.145598 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.145613 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.145633 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.145647 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:09Z","lastTransitionTime":"2025-10-06T08:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.248345 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.248401 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.248414 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.248433 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.248445 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:09Z","lastTransitionTime":"2025-10-06T08:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.351149 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.351215 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.351234 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.351264 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.351282 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:09Z","lastTransitionTime":"2025-10-06T08:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.454258 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.454409 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.454451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.454485 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.454507 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:09Z","lastTransitionTime":"2025-10-06T08:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.556981 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.557033 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.557044 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.557061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.557073 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:09Z","lastTransitionTime":"2025-10-06T08:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.660193 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.660254 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.660269 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.660290 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.660311 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:09Z","lastTransitionTime":"2025-10-06T08:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.763532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.763632 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.763727 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.763760 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.763783 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:09Z","lastTransitionTime":"2025-10-06T08:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.866390 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.866525 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.866547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.866637 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.866724 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:09Z","lastTransitionTime":"2025-10-06T08:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.971022 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.971076 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.971085 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.971108 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:09 crc kubenswrapper[4755]: I1006 08:24:09.971119 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:09Z","lastTransitionTime":"2025-10-06T08:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.075235 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.075308 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.075326 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.075351 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.075372 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:10Z","lastTransitionTime":"2025-10-06T08:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.178811 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.178878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.178915 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.178962 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.178989 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:10Z","lastTransitionTime":"2025-10-06T08:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.282327 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.282383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.282404 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.282427 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.282444 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:10Z","lastTransitionTime":"2025-10-06T08:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.385687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.385733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.385744 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.385761 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.385778 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:10Z","lastTransitionTime":"2025-10-06T08:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.493387 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.493471 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.493818 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.493845 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.493867 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:10Z","lastTransitionTime":"2025-10-06T08:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.595632 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.595683 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.595696 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.595713 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.595726 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:10Z","lastTransitionTime":"2025-10-06T08:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.698732 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.698782 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.698793 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.698811 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.698823 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:10Z","lastTransitionTime":"2025-10-06T08:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.802124 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.802183 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.802193 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.802212 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.802227 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:10Z","lastTransitionTime":"2025-10-06T08:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.877973 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.878024 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:10 crc kubenswrapper[4755]: E1006 08:24:10.878172 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.878362 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:10 crc kubenswrapper[4755]: E1006 08:24:10.878464 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.878477 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:10 crc kubenswrapper[4755]: E1006 08:24:10.878716 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:10 crc kubenswrapper[4755]: E1006 08:24:10.878768 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.905666 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.905938 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.905968 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.906005 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:10 crc kubenswrapper[4755]: I1006 08:24:10.906030 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:10Z","lastTransitionTime":"2025-10-06T08:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.009912 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.009989 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.010009 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.010039 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.010062 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.112658 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.112720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.112732 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.112753 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.112767 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.216166 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.216240 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.216259 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.216287 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.216307 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.319228 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.319288 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.319302 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.319321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.319337 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.422190 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.422251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.422263 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.422281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.422294 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.524621 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.524709 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.524733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.524765 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.524897 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.628668 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.628744 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.628764 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.628785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.628802 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.704252 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.704298 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.704310 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.704327 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.704338 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: E1006 08:24:11.716112 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.719974 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.720011 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.720021 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.720038 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.720047 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: E1006 08:24:11.731477 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.735204 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.735240 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.735252 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.735269 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.735290 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: E1006 08:24:11.746818 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.750844 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.750889 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.750899 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.750919 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.750933 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: E1006 08:24:11.766837 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.770697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.770742 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.770751 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.770766 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.770778 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: E1006 08:24:11.782350 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:24:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"699772fe-1bda-4c36-8c0f-3619ae33584c\\\",\\\"systemUUID\\\":\\\"ec918f86-fe57-44c4-9b07-fa73cce83870\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:24:11Z is after 2025-08-24T17:21:41Z" Oct 06 08:24:11 crc kubenswrapper[4755]: E1006 08:24:11.782532 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.784413 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.784506 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.784525 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.784545 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.784583 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.886360 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.886403 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.886420 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.886441 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.886457 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.989781 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.989847 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.989863 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.989887 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:11 crc kubenswrapper[4755]: I1006 08:24:11.989902 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:11Z","lastTransitionTime":"2025-10-06T08:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.092610 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.092657 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.092669 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.092686 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.092698 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:12Z","lastTransitionTime":"2025-10-06T08:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.195755 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.195836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.195847 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.195861 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.195870 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:12Z","lastTransitionTime":"2025-10-06T08:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.299310 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.299679 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.299788 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.299925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.300022 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:12Z","lastTransitionTime":"2025-10-06T08:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.403233 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.403264 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.403275 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.403292 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.403304 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:12Z","lastTransitionTime":"2025-10-06T08:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.506886 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.506955 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.506974 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.506998 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.507015 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:12Z","lastTransitionTime":"2025-10-06T08:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.609515 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.609591 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.609604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.609621 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.609633 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:12Z","lastTransitionTime":"2025-10-06T08:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.712418 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.712471 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.712485 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.712502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.712514 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:12Z","lastTransitionTime":"2025-10-06T08:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.815733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.815865 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.815926 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.815958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.816026 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:12Z","lastTransitionTime":"2025-10-06T08:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.878786 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.878834 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.879007 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.879328 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:12 crc kubenswrapper[4755]: E1006 08:24:12.879556 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:12 crc kubenswrapper[4755]: E1006 08:24:12.879910 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:12 crc kubenswrapper[4755]: E1006 08:24:12.880045 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:12 crc kubenswrapper[4755]: E1006 08:24:12.880075 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.920358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.920426 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.920448 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.920477 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:12 crc kubenswrapper[4755]: I1006 08:24:12.920529 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:12Z","lastTransitionTime":"2025-10-06T08:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.023481 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.024025 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.024041 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.024062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.024076 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:13Z","lastTransitionTime":"2025-10-06T08:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.127606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.127678 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.127695 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.127720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.127738 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:13Z","lastTransitionTime":"2025-10-06T08:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.231133 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.231189 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.231207 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.231227 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.231242 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:13Z","lastTransitionTime":"2025-10-06T08:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.334486 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.334547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.334556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.334583 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.334593 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:13Z","lastTransitionTime":"2025-10-06T08:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.437257 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.437345 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.437382 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.437413 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.437436 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:13Z","lastTransitionTime":"2025-10-06T08:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.540437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.540512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.540531 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.540558 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.540611 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:13Z","lastTransitionTime":"2025-10-06T08:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.643108 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.643181 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.643205 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.643237 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.643260 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:13Z","lastTransitionTime":"2025-10-06T08:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.746459 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.746537 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.746559 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.746623 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.746648 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:13Z","lastTransitionTime":"2025-10-06T08:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.856153 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.856219 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.856244 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.856275 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.856299 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:13Z","lastTransitionTime":"2025-10-06T08:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.959376 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.959417 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.959429 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.959443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.959455 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:13Z","lastTransitionTime":"2025-10-06T08:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.967294 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-r96nx" podStartSLOduration=90.967271273 podStartE2EDuration="1m30.967271273s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:13.933492956 +0000 UTC m=+110.762808200" watchObservedRunningTime="2025-10-06 08:24:13.967271273 +0000 UTC m=+110.796586497" Oct 06 08:24:13 crc kubenswrapper[4755]: I1006 08:24:13.991116 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mh26r" podStartSLOduration=90.991097068 podStartE2EDuration="1m30.991097068s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:13.979354941 +0000 UTC m=+110.808670165" watchObservedRunningTime="2025-10-06 08:24:13.991097068 +0000 UTC m=+110.820412282" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.015793 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=87.015773255 podStartE2EDuration="1m27.015773255s" podCreationTimestamp="2025-10-06 08:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:14.014858531 +0000 UTC m=+110.844173765" watchObservedRunningTime="2025-10-06 08:24:14.015773255 +0000 UTC m=+110.845088469" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.043229 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=61.043211491 podStartE2EDuration="1m1.043211491s" 
podCreationTimestamp="2025-10-06 08:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:14.042422241 +0000 UTC m=+110.871737455" watchObservedRunningTime="2025-10-06 08:24:14.043211491 +0000 UTC m=+110.872526695" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.063612 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.063664 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.063673 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.063687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.063699 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:14Z","lastTransitionTime":"2025-10-06T08:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.132401 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6m7xn" podStartSLOduration=91.132374925 podStartE2EDuration="1m31.132374925s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:14.131478422 +0000 UTC m=+110.960793636" watchObservedRunningTime="2025-10-06 08:24:14.132374925 +0000 UTC m=+110.961690139" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.160958 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=18.160927869 podStartE2EDuration="18.160927869s" podCreationTimestamp="2025-10-06 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:14.144556104 +0000 UTC m=+110.973871318" watchObservedRunningTime="2025-10-06 08:24:14.160927869 +0000 UTC m=+110.990243083" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.161477 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=91.161470564 podStartE2EDuration="1m31.161470564s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:14.160603391 +0000 UTC m=+110.989918605" watchObservedRunningTime="2025-10-06 08:24:14.161470564 +0000 UTC m=+110.990785778" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.167204 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:14 crc 
kubenswrapper[4755]: I1006 08:24:14.167256 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.167268 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.167326 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.167356 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:14Z","lastTransitionTime":"2025-10-06T08:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.178313 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=91.17828803 podStartE2EDuration="1m31.17828803s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:14.17667604 +0000 UTC m=+111.005991274" watchObservedRunningTime="2025-10-06 08:24:14.17828803 +0000 UTC m=+111.007603244" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.206415 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jxm75" podStartSLOduration=91.206396124 podStartE2EDuration="1m31.206396124s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-06 08:24:14.206105307 +0000 UTC m=+111.035420531" watchObservedRunningTime="2025-10-06 08:24:14.206396124 +0000 UTC m=+111.035711338" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.223853 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xsg89" podStartSLOduration=91.223829666 podStartE2EDuration="1m31.223829666s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:14.223248302 +0000 UTC m=+111.052563516" watchObservedRunningTime="2025-10-06 08:24:14.223829666 +0000 UTC m=+111.053144870" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.237220 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podStartSLOduration=91.237196916 podStartE2EDuration="1m31.237196916s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:14.235813571 +0000 UTC m=+111.065128785" watchObservedRunningTime="2025-10-06 08:24:14.237196916 +0000 UTC m=+111.066512130" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.270049 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.270111 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.270120 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.270141 4755 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.270155 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:14Z","lastTransitionTime":"2025-10-06T08:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.374249 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.374310 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.374324 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.374346 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.374361 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:14Z","lastTransitionTime":"2025-10-06T08:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.478198 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.478247 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.478257 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.478274 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.478285 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:14Z","lastTransitionTime":"2025-10-06T08:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.581999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.582050 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.582062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.582085 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.582100 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:14Z","lastTransitionTime":"2025-10-06T08:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.686468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.686536 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.686552 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.686606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.686623 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:14Z","lastTransitionTime":"2025-10-06T08:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.789850 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.789917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.789936 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.789964 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.789983 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:14Z","lastTransitionTime":"2025-10-06T08:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.878918 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.879037 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.879039 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.879189 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:14 crc kubenswrapper[4755]: E1006 08:24:14.879351 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:14 crc kubenswrapper[4755]: E1006 08:24:14.879443 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:14 crc kubenswrapper[4755]: E1006 08:24:14.879539 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:14 crc kubenswrapper[4755]: E1006 08:24:14.879662 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.894593 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.894697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.894721 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.894749 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.894771 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:14Z","lastTransitionTime":"2025-10-06T08:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.998272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.998341 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.998366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.998401 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:14 crc kubenswrapper[4755]: I1006 08:24:14.998429 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:14Z","lastTransitionTime":"2025-10-06T08:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.100407 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.100520 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.100538 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.100563 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.100617 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:15Z","lastTransitionTime":"2025-10-06T08:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.206101 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.206202 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.206233 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.206268 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.206336 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:15Z","lastTransitionTime":"2025-10-06T08:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.309864 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.309908 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.309920 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.309935 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.309946 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:15Z","lastTransitionTime":"2025-10-06T08:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.412791 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.412864 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.412886 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.412946 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.412963 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:15Z","lastTransitionTime":"2025-10-06T08:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.515945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.516020 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.516032 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.516050 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.516064 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:15Z","lastTransitionTime":"2025-10-06T08:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.618604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.618645 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.618657 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.618677 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.618691 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:15Z","lastTransitionTime":"2025-10-06T08:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.722004 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.722082 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.722125 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.722154 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.722169 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:15Z","lastTransitionTime":"2025-10-06T08:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.825393 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.825435 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.825445 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.825464 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.825476 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:15Z","lastTransitionTime":"2025-10-06T08:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.928338 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.928386 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.928406 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.928428 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:15 crc kubenswrapper[4755]: I1006 08:24:15.928446 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:15Z","lastTransitionTime":"2025-10-06T08:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.031303 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.031458 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.031481 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.031557 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.031640 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:16Z","lastTransitionTime":"2025-10-06T08:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.135402 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.135467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.135485 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.135509 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.135526 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:16Z","lastTransitionTime":"2025-10-06T08:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.238328 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.238384 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.238393 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.238411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.238424 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:16Z","lastTransitionTime":"2025-10-06T08:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.341057 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.341114 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.341126 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.341151 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.341163 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:16Z","lastTransitionTime":"2025-10-06T08:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.476447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.476494 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.476505 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.476522 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.476537 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:16Z","lastTransitionTime":"2025-10-06T08:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.579057 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.579100 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.579110 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.579126 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.579140 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:16Z","lastTransitionTime":"2025-10-06T08:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.683854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.683957 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.683978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.684009 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.684031 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:16Z","lastTransitionTime":"2025-10-06T08:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.788803 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.788875 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.788887 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.788910 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.788924 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:16Z","lastTransitionTime":"2025-10-06T08:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.878842 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.878930 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.879015 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.878954 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:16 crc kubenswrapper[4755]: E1006 08:24:16.879127 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:16 crc kubenswrapper[4755]: E1006 08:24:16.879263 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:16 crc kubenswrapper[4755]: E1006 08:24:16.879423 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:16 crc kubenswrapper[4755]: E1006 08:24:16.879545 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.892673 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.892734 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.892765 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.892796 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.892816 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:16Z","lastTransitionTime":"2025-10-06T08:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.996847 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.996898 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.996914 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.996939 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:16 crc kubenswrapper[4755]: I1006 08:24:16.996958 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:16Z","lastTransitionTime":"2025-10-06T08:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.099807 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.099877 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.099894 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.099921 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.099948 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:17Z","lastTransitionTime":"2025-10-06T08:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.203470 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.203621 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.203647 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.203676 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.203699 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:17Z","lastTransitionTime":"2025-10-06T08:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.307078 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.307158 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.307178 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.307211 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.307233 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:17Z","lastTransitionTime":"2025-10-06T08:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.410945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.411038 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.411060 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.411091 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.411112 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:17Z","lastTransitionTime":"2025-10-06T08:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.514507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.514604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.514625 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.514662 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.514686 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:17Z","lastTransitionTime":"2025-10-06T08:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.613656 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r96nx_891dff9a-4752-4022-83fc-51f626c76991/kube-multus/1.log" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.614644 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r96nx_891dff9a-4752-4022-83fc-51f626c76991/kube-multus/0.log" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.614736 4755 generic.go:334] "Generic (PLEG): container finished" podID="891dff9a-4752-4022-83fc-51f626c76991" containerID="252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c" exitCode=1 Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.614825 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r96nx" event={"ID":"891dff9a-4752-4022-83fc-51f626c76991","Type":"ContainerDied","Data":"252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c"} Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.615006 4755 scope.go:117] "RemoveContainer" containerID="316dc05b7755a3366beb19f72444c830d9efa3f703a955d63f27cf1aafffdaaa" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.615602 4755 scope.go:117] "RemoveContainer" containerID="252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c" Oct 06 08:24:17 crc kubenswrapper[4755]: E1006 08:24:17.616295 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-r96nx_openshift-multus(891dff9a-4752-4022-83fc-51f626c76991)\"" pod="openshift-multus/multus-r96nx" podUID="891dff9a-4752-4022-83fc-51f626c76991" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.620530 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:17 crc 
kubenswrapper[4755]: I1006 08:24:17.620646 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.620672 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.620713 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.620735 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:17Z","lastTransitionTime":"2025-10-06T08:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.725077 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.725147 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.725168 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.725200 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.725223 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:17Z","lastTransitionTime":"2025-10-06T08:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.828494 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.828553 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.828612 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.828640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.828661 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:17Z","lastTransitionTime":"2025-10-06T08:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.932535 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.932672 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.932698 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.932737 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:17 crc kubenswrapper[4755]: I1006 08:24:17.932764 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:17Z","lastTransitionTime":"2025-10-06T08:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.035892 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.035968 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.035986 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.036014 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.036033 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:18Z","lastTransitionTime":"2025-10-06T08:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.139247 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.139300 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.139317 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.139342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.139360 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:18Z","lastTransitionTime":"2025-10-06T08:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.243104 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.243178 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.243203 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.243238 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.243261 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:18Z","lastTransitionTime":"2025-10-06T08:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.347670 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.348182 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.348247 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.348282 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.348337 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:18Z","lastTransitionTime":"2025-10-06T08:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.451862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.451918 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.451934 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.451964 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.451982 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:18Z","lastTransitionTime":"2025-10-06T08:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.561265 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.561383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.561408 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.561436 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.561454 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:18Z","lastTransitionTime":"2025-10-06T08:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.623428 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r96nx_891dff9a-4752-4022-83fc-51f626c76991/kube-multus/1.log" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.666494 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.666641 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.666670 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.666706 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.666731 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:18Z","lastTransitionTime":"2025-10-06T08:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.770452 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.770923 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.771150 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.771340 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.771534 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:18Z","lastTransitionTime":"2025-10-06T08:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.874840 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.875373 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.875611 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.875834 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.875999 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:18Z","lastTransitionTime":"2025-10-06T08:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.878341 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.878364 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.878343 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:18 crc kubenswrapper[4755]: E1006 08:24:18.878491 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:18 crc kubenswrapper[4755]: E1006 08:24:18.878648 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:18 crc kubenswrapper[4755]: E1006 08:24:18.878738 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.878788 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:18 crc kubenswrapper[4755]: E1006 08:24:18.879407 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.980140 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.980216 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.980235 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.980264 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:18 crc kubenswrapper[4755]: I1006 08:24:18.980284 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:18Z","lastTransitionTime":"2025-10-06T08:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.083708 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.083774 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.083789 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.083809 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.083828 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:19Z","lastTransitionTime":"2025-10-06T08:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.187065 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.187117 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.187127 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.187146 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.187156 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:19Z","lastTransitionTime":"2025-10-06T08:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.290544 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.290650 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.290677 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.290718 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.290752 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:19Z","lastTransitionTime":"2025-10-06T08:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.394167 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.394221 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.394241 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.394266 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.394283 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:19Z","lastTransitionTime":"2025-10-06T08:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.497805 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.497886 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.497908 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.497938 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.497967 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:19Z","lastTransitionTime":"2025-10-06T08:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.603790 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.603881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.603904 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.603946 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.603970 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:19Z","lastTransitionTime":"2025-10-06T08:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.708181 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.708244 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.708261 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.708284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.708300 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:19Z","lastTransitionTime":"2025-10-06T08:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.812037 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.812114 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.812133 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.812161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.812180 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:19Z","lastTransitionTime":"2025-10-06T08:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.914982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.915044 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.915062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.915147 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:19 crc kubenswrapper[4755]: I1006 08:24:19.915181 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:19Z","lastTransitionTime":"2025-10-06T08:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.019013 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.019126 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.019137 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.019179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.019192 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:20Z","lastTransitionTime":"2025-10-06T08:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.122828 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.122913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.122934 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.122961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.122981 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:20Z","lastTransitionTime":"2025-10-06T08:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.227091 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.227170 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.227182 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.227200 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.227213 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:20Z","lastTransitionTime":"2025-10-06T08:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.745698 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.745755 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.745775 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.745797 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.745814 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:20Z","lastTransitionTime":"2025-10-06T08:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.850910 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.850990 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.851016 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.851047 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.851068 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:20Z","lastTransitionTime":"2025-10-06T08:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.877790 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.877824 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.877861 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.877791 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:20 crc kubenswrapper[4755]: E1006 08:24:20.877986 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:20 crc kubenswrapper[4755]: E1006 08:24:20.878114 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:20 crc kubenswrapper[4755]: E1006 08:24:20.878254 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:20 crc kubenswrapper[4755]: E1006 08:24:20.878402 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.954587 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.954673 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.954686 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.954707 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:20 crc kubenswrapper[4755]: I1006 08:24:20.954723 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:20Z","lastTransitionTime":"2025-10-06T08:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.058118 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.058171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.058180 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.058199 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.058210 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:21Z","lastTransitionTime":"2025-10-06T08:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.785115 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.785217 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.785242 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.785278 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.785310 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:21Z","lastTransitionTime":"2025-10-06T08:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.879945 4755 scope.go:117] "RemoveContainer" containerID="5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.887506 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.887551 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.887580 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.887601 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.887644 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:21Z","lastTransitionTime":"2025-10-06T08:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.990453 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.990506 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.990519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.990543 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:21 crc kubenswrapper[4755]: I1006 08:24:21.990587 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:21Z","lastTransitionTime":"2025-10-06T08:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.093341 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.093379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.093387 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.093404 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.093414 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:22Z","lastTransitionTime":"2025-10-06T08:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.181631 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.181724 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.181749 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.181787 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.181853 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:24:22Z","lastTransitionTime":"2025-10-06T08:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.244704 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff"] Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.245177 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.250016 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.250053 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.250273 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.250350 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.355800 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4a2fa680-e270-46e8-a016-a226ff2f9ef3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.355882 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4a2fa680-e270-46e8-a016-a226ff2f9ef3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.356027 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4a2fa680-e270-46e8-a016-a226ff2f9ef3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.356280 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a2fa680-e270-46e8-a016-a226ff2f9ef3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.356337 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a2fa680-e270-46e8-a016-a226ff2f9ef3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.457716 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4a2fa680-e270-46e8-a016-a226ff2f9ef3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.457790 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a2fa680-e270-46e8-a016-a226ff2f9ef3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.457841 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a2fa680-e270-46e8-a016-a226ff2f9ef3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.457866 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a2fa680-e270-46e8-a016-a226ff2f9ef3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.457890 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4a2fa680-e270-46e8-a016-a226ff2f9ef3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.457955 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4a2fa680-e270-46e8-a016-a226ff2f9ef3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.458008 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/4a2fa680-e270-46e8-a016-a226ff2f9ef3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.460401 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a2fa680-e270-46e8-a016-a226ff2f9ef3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.465626 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a2fa680-e270-46e8-a016-a226ff2f9ef3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.476101 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a2fa680-e270-46e8-a016-a226ff2f9ef3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-48sff\" (UID: \"4a2fa680-e270-46e8-a016-a226ff2f9ef3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.566829 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" Oct 06 08:24:22 crc kubenswrapper[4755]: W1006 08:24:22.583447 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a2fa680_e270_46e8_a016_a226ff2f9ef3.slice/crio-95ad2a64b365c4f74e6b52f1d7e918700c341c3d261ae5dc568f756d0f2eebf7 WatchSource:0}: Error finding container 95ad2a64b365c4f74e6b52f1d7e918700c341c3d261ae5dc568f756d0f2eebf7: Status 404 returned error can't find the container with id 95ad2a64b365c4f74e6b52f1d7e918700c341c3d261ae5dc568f756d0f2eebf7 Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.645196 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" event={"ID":"4a2fa680-e270-46e8-a016-a226ff2f9ef3","Type":"ContainerStarted","Data":"95ad2a64b365c4f74e6b52f1d7e918700c341c3d261ae5dc568f756d0f2eebf7"} Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.648799 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/3.log" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.651719 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerStarted","Data":"cb1b1c2195b9c9b6379198f3a3261db7589467cdce5907a8d6e27d4c77ba7723"} Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.653417 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.682982 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podStartSLOduration=99.682959034 podStartE2EDuration="1m39.682959034s" 
podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:22.682100592 +0000 UTC m=+119.511415836" watchObservedRunningTime="2025-10-06 08:24:22.682959034 +0000 UTC m=+119.512274248" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.838511 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vf9ht"] Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.838747 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:22 crc kubenswrapper[4755]: E1006 08:24:22.838903 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.878352 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.878422 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:22 crc kubenswrapper[4755]: I1006 08:24:22.878419 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:22 crc kubenswrapper[4755]: E1006 08:24:22.878560 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:22 crc kubenswrapper[4755]: E1006 08:24:22.878724 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:22 crc kubenswrapper[4755]: E1006 08:24:22.878795 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:23 crc kubenswrapper[4755]: I1006 08:24:23.655914 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" event={"ID":"4a2fa680-e270-46e8-a016-a226ff2f9ef3","Type":"ContainerStarted","Data":"cbe7e15564f0af28709de90af8cb1b072402e1a3f7658933f155f8160bd28243"} Oct 06 08:24:23 crc kubenswrapper[4755]: I1006 08:24:23.677443 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-48sff" podStartSLOduration=100.677402598 podStartE2EDuration="1m40.677402598s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:23.675439329 +0000 UTC m=+120.504754583" watchObservedRunningTime="2025-10-06 08:24:23.677402598 +0000 UTC m=+120.506717822" Oct 06 08:24:23 crc kubenswrapper[4755]: E1006 08:24:23.855731 4755 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 06 08:24:24 crc kubenswrapper[4755]: E1006 08:24:24.019826 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 08:24:24 crc kubenswrapper[4755]: I1006 08:24:24.878524 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:24 crc kubenswrapper[4755]: I1006 08:24:24.878525 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:24 crc kubenswrapper[4755]: I1006 08:24:24.878558 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:24 crc kubenswrapper[4755]: I1006 08:24:24.878645 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:24 crc kubenswrapper[4755]: E1006 08:24:24.879146 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:24 crc kubenswrapper[4755]: E1006 08:24:24.879446 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:24 crc kubenswrapper[4755]: E1006 08:24:24.879735 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:24 crc kubenswrapper[4755]: E1006 08:24:24.879890 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:26 crc kubenswrapper[4755]: I1006 08:24:26.878684 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:26 crc kubenswrapper[4755]: I1006 08:24:26.878751 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:26 crc kubenswrapper[4755]: I1006 08:24:26.878822 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:26 crc kubenswrapper[4755]: E1006 08:24:26.879123 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:26 crc kubenswrapper[4755]: I1006 08:24:26.879146 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:26 crc kubenswrapper[4755]: E1006 08:24:26.879729 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:26 crc kubenswrapper[4755]: E1006 08:24:26.880004 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:26 crc kubenswrapper[4755]: E1006 08:24:26.880128 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:28 crc kubenswrapper[4755]: I1006 08:24:28.877875 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:28 crc kubenswrapper[4755]: I1006 08:24:28.877924 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:28 crc kubenswrapper[4755]: I1006 08:24:28.877964 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:28 crc kubenswrapper[4755]: I1006 08:24:28.877923 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:28 crc kubenswrapper[4755]: E1006 08:24:28.878210 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:28 crc kubenswrapper[4755]: E1006 08:24:28.878355 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:28 crc kubenswrapper[4755]: E1006 08:24:28.878516 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:28 crc kubenswrapper[4755]: E1006 08:24:28.878627 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:29 crc kubenswrapper[4755]: E1006 08:24:29.022369 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 08:24:30 crc kubenswrapper[4755]: I1006 08:24:30.878534 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:30 crc kubenswrapper[4755]: I1006 08:24:30.878729 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:30 crc kubenswrapper[4755]: I1006 08:24:30.878765 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:30 crc kubenswrapper[4755]: E1006 08:24:30.878879 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:30 crc kubenswrapper[4755]: E1006 08:24:30.878995 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:30 crc kubenswrapper[4755]: I1006 08:24:30.879062 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:30 crc kubenswrapper[4755]: E1006 08:24:30.879192 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:30 crc kubenswrapper[4755]: E1006 08:24:30.879277 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:31 crc kubenswrapper[4755]: I1006 08:24:31.879144 4755 scope.go:117] "RemoveContainer" containerID="252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c" Oct 06 08:24:32 crc kubenswrapper[4755]: I1006 08:24:32.702868 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r96nx_891dff9a-4752-4022-83fc-51f626c76991/kube-multus/1.log" Oct 06 08:24:32 crc kubenswrapper[4755]: I1006 08:24:32.703404 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r96nx" event={"ID":"891dff9a-4752-4022-83fc-51f626c76991","Type":"ContainerStarted","Data":"8f5c5a4fe5b9198f4a4c418537672dd9a1cf023530aef141cb92df515748ed51"} Oct 06 08:24:32 crc kubenswrapper[4755]: I1006 08:24:32.878264 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:32 crc kubenswrapper[4755]: I1006 08:24:32.878312 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:32 crc kubenswrapper[4755]: I1006 08:24:32.878264 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:32 crc kubenswrapper[4755]: I1006 08:24:32.878264 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:32 crc kubenswrapper[4755]: E1006 08:24:32.878535 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vf9ht" podUID="60fbd235-a60f-436e-9552-e3eaf60f24f3" Oct 06 08:24:32 crc kubenswrapper[4755]: E1006 08:24:32.878852 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:24:32 crc kubenswrapper[4755]: E1006 08:24:32.878947 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:24:32 crc kubenswrapper[4755]: E1006 08:24:32.879003 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:24:34 crc kubenswrapper[4755]: I1006 08:24:34.878648 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:24:34 crc kubenswrapper[4755]: I1006 08:24:34.878664 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:24:34 crc kubenswrapper[4755]: I1006 08:24:34.878842 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:24:34 crc kubenswrapper[4755]: I1006 08:24:34.879072 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:34 crc kubenswrapper[4755]: I1006 08:24:34.882990 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 06 08:24:34 crc kubenswrapper[4755]: I1006 08:24:34.883510 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 06 08:24:34 crc kubenswrapper[4755]: I1006 08:24:34.883629 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 06 08:24:34 crc kubenswrapper[4755]: I1006 08:24:34.884672 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 06 08:24:34 crc kubenswrapper[4755]: I1006 08:24:34.884778 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 06 08:24:34 crc kubenswrapper[4755]: I1006 08:24:34.884839 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.654783 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.699707 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 
08:24:42.700277 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.706539 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2lqpg"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.706910 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4skj5"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.707084 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hztlt"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.707481 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.708917 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.709473 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.712523 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.712816 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.713008 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.713836 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.714304 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.714458 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.714486 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.714673 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.714734 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.715152 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.715327 4755 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.715480 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.718394 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hpgbv"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.720074 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.720359 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.721174 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.721263 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.721850 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.723225 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p47k9"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.723742 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5snnf"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.724138 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.724213 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.725068 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.725136 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.725309 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.725637 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.725662 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.725941 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.726094 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.726235 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.731126 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.731436 4755 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.731649 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.731982 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.732155 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.732332 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.732506 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.732655 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.733334 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.733932 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.735218 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4wgdb"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.735706 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4wgdb" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.741051 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g6zp7"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.742140 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.744786 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.745309 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.746437 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nrx4l"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.747144 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.753031 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.753283 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.753759 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.754149 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.754374 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.754549 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.754993 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.755107 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.755917 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.756042 4755 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.756052 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.756225 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.756364 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.756661 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.756264 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.756298 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.757283 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.757416 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.757479 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.757732 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.757917 
4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.757356 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.757929 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.758041 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.758094 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.758167 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.758218 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.758639 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.758721 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.758771 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.774642 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.758848 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.758887 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.758899 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.758971 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.759136 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.759168 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.777398 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.759191 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.777831 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.777899 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.777994 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.778160 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.759267 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.778408 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.759277 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.772877 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.778963 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.779204 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.779452 4755 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.779520 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.780101 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.780261 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.780608 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.780673 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.791721 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.791802 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.792025 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.794958 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-klxzw"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.795621 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.795660 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.795985 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.796092 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.796176 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.797930 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.799524 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-klxzw" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.804471 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.804780 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.804891 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.805233 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.805507 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.810627 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812375 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f262d5f0-ec94-4668-9a49-47616dd4625f-encryption-config\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812411 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqshq\" (UniqueName: \"kubernetes.io/projected/7c5c24ec-6be2-4b4c-a321-2559254d8158-kube-api-access-fqshq\") pod \"dns-operator-744455d44c-4wgdb\" (UID: 
\"7c5c24ec-6be2-4b4c-a321-2559254d8158\") " pod="openshift-dns-operator/dns-operator-744455d44c-4wgdb" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812437 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4d50d581-684f-48fb-86fa-86339fe67de7-auth-proxy-config\") pod \"machine-approver-56656f9798-74nwf\" (UID: \"4d50d581-684f-48fb-86fa-86339fe67de7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812454 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b128405-242c-41da-9259-9e6fa646e505-etcd-client\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812473 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1b128405-242c-41da-9259-9e6fa646e505-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812492 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1b128405-242c-41da-9259-9e6fa646e505-encryption-config\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812510 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/f262d5f0-ec94-4668-9a49-47616dd4625f-node-pullsecrets\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812529 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-config\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812544 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b128405-242c-41da-9259-9e6fa646e505-serving-cert\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812575 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c5c24ec-6be2-4b4c-a321-2559254d8158-metrics-tls\") pod \"dns-operator-744455d44c-4wgdb\" (UID: \"7c5c24ec-6be2-4b4c-a321-2559254d8158\") " pod="openshift-dns-operator/dns-operator-744455d44c-4wgdb" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812594 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ab502ea-2cb2-4127-a081-d871168af9aa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bj8wv\" (UID: \"9ab502ea-2cb2-4127-a081-d871168af9aa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812615 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg942\" (UniqueName: \"kubernetes.io/projected/4d50d581-684f-48fb-86fa-86339fe67de7-kube-api-access-cg942\") pod \"machine-approver-56656f9798-74nwf\" (UID: \"4d50d581-684f-48fb-86fa-86339fe67de7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812636 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eee21fd7-fd7b-4924-ac33-4e086deb424c-serving-cert\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812653 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsxkz\" (UniqueName: \"kubernetes.io/projected/f262d5f0-ec94-4668-9a49-47616dd4625f-kube-api-access-rsxkz\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812668 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c98cbede-25b7-40d4-b1ad-18e144e46bcc-serving-cert\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812697 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1b128405-242c-41da-9259-9e6fa646e505-audit-policies\") pod \"apiserver-7bbb656c7d-h5272\" (UID: 
\"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812714 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-etcd-serving-ca\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812730 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f262d5f0-ec94-4668-9a49-47616dd4625f-audit-dir\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812746 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/39386f6f-4d16-4a81-9432-e486d9e6ee60-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2lqpg\" (UID: \"39386f6f-4d16-4a81-9432-e486d9e6ee60\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812762 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/266418ff-0098-46b7-a0b2-e930a1dfb1d8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-42p29\" (UID: \"266418ff-0098-46b7-a0b2-e930a1dfb1d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812777 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-gql8l\" (UniqueName: \"kubernetes.io/projected/eee21fd7-fd7b-4924-ac33-4e086deb424c-kube-api-access-gql8l\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812792 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b128405-242c-41da-9259-9e6fa646e505-audit-dir\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812809 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/39386f6f-4d16-4a81-9432-e486d9e6ee60-images\") pod \"machine-api-operator-5694c8668f-2lqpg\" (UID: \"39386f6f-4d16-4a81-9432-e486d9e6ee60\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812832 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-config\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812855 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " 
pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812879 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-audit\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812895 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqdv5\" (UniqueName: \"kubernetes.io/projected/39386f6f-4d16-4a81-9432-e486d9e6ee60-kube-api-access-jqdv5\") pod \"machine-api-operator-5694c8668f-2lqpg\" (UID: \"39386f6f-4d16-4a81-9432-e486d9e6ee60\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812911 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4d50d581-684f-48fb-86fa-86339fe67de7-machine-approver-tls\") pod \"machine-approver-56656f9798-74nwf\" (UID: \"4d50d581-684f-48fb-86fa-86339fe67de7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812926 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b128405-242c-41da-9259-9e6fa646e505-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812942 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812957 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eee21fd7-fd7b-4924-ac33-4e086deb424c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812974 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eee21fd7-fd7b-4924-ac33-4e086deb424c-service-ca-bundle\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.812990 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f262d5f0-ec94-4668-9a49-47616dd4625f-etcd-client\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.813007 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9fk4\" (UniqueName: \"kubernetes.io/projected/1b128405-242c-41da-9259-9e6fa646e505-kube-api-access-g9fk4\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.813024 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7rk7\" (UniqueName: \"kubernetes.io/projected/c98cbede-25b7-40d4-b1ad-18e144e46bcc-kube-api-access-r7rk7\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.813041 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f262d5f0-ec94-4668-9a49-47616dd4625f-serving-cert\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.813063 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j4br\" (UniqueName: \"kubernetes.io/projected/266418ff-0098-46b7-a0b2-e930a1dfb1d8-kube-api-access-6j4br\") pod \"openshift-controller-manager-operator-756b6f6bc6-42p29\" (UID: \"266418ff-0098-46b7-a0b2-e930a1dfb1d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.813087 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee21fd7-fd7b-4924-ac33-4e086deb424c-config\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.813102 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-image-import-ca\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.813120 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfr9\" (UniqueName: \"kubernetes.io/projected/9ab502ea-2cb2-4127-a081-d871168af9aa-kube-api-access-shfr9\") pod \"openshift-apiserver-operator-796bbdcf4f-bj8wv\" (UID: \"9ab502ea-2cb2-4127-a081-d871168af9aa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.813145 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d50d581-684f-48fb-86fa-86339fe67de7-config\") pod \"machine-approver-56656f9798-74nwf\" (UID: \"4d50d581-684f-48fb-86fa-86339fe67de7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.813166 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ab502ea-2cb2-4127-a081-d871168af9aa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bj8wv\" (UID: \"9ab502ea-2cb2-4127-a081-d871168af9aa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.813186 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-client-ca\") pod \"controller-manager-879f6c89f-4skj5\" (UID: 
\"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.813205 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/266418ff-0098-46b7-a0b2-e930a1dfb1d8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-42p29\" (UID: \"266418ff-0098-46b7-a0b2-e930a1dfb1d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.813231 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39386f6f-4d16-4a81-9432-e486d9e6ee60-config\") pod \"machine-api-operator-5694c8668f-2lqpg\" (UID: \"39386f6f-4d16-4a81-9432-e486d9e6ee60\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.813883 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.814043 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4skj5"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.814083 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.814796 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.815275 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.815628 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.815900 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.816070 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.816532 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.816920 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.818166 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.818730 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.835009 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.838322 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.861137 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.861385 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.862339 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.862673 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.864230 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.864471 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.867008 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.867583 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.868421 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.870011 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2lqpg"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.871548 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4vctk"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.872068 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.874593 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.875011 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.877524 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.878161 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.883397 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fdr74"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.884184 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zqsmk"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.884584 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.884912 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fdr74" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.891213 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.892231 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.895973 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-zbxjs"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.896178 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.897172 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.903127 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.906347 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.907188 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.908147 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.908967 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.909239 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kfxvn"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.909654 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.910804 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xp227"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.911215 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.911344 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.912386 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.913332 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.913812 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915238 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-audit\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915268 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqdv5\" (UniqueName: \"kubernetes.io/projected/39386f6f-4d16-4a81-9432-e486d9e6ee60-kube-api-access-jqdv5\") pod \"machine-api-operator-5694c8668f-2lqpg\" (UID: \"39386f6f-4d16-4a81-9432-e486d9e6ee60\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915295 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4d50d581-684f-48fb-86fa-86339fe67de7-machine-approver-tls\") pod \"machine-approver-56656f9798-74nwf\" (UID: \"4d50d581-684f-48fb-86fa-86339fe67de7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915314 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b128405-242c-41da-9259-9e6fa646e505-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915332 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/eee21fd7-fd7b-4924-ac33-4e086deb424c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915352 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eee21fd7-fd7b-4924-ac33-4e086deb424c-service-ca-bundle\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915374 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915394 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7rk7\" (UniqueName: \"kubernetes.io/projected/c98cbede-25b7-40d4-b1ad-18e144e46bcc-kube-api-access-r7rk7\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915412 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f262d5f0-ec94-4668-9a49-47616dd4625f-etcd-client\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 
08:24:42.915430 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9fk4\" (UniqueName: \"kubernetes.io/projected/1b128405-242c-41da-9259-9e6fa646e505-kube-api-access-g9fk4\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915450 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f262d5f0-ec94-4668-9a49-47616dd4625f-serving-cert\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915470 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j4br\" (UniqueName: \"kubernetes.io/projected/266418ff-0098-46b7-a0b2-e930a1dfb1d8-kube-api-access-6j4br\") pod \"openshift-controller-manager-operator-756b6f6bc6-42p29\" (UID: \"266418ff-0098-46b7-a0b2-e930a1dfb1d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915491 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee21fd7-fd7b-4924-ac33-4e086deb424c-config\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915513 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-image-import-ca\") pod \"apiserver-76f77b778f-hztlt\" (UID: 
\"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915541 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shfr9\" (UniqueName: \"kubernetes.io/projected/9ab502ea-2cb2-4127-a081-d871168af9aa-kube-api-access-shfr9\") pod \"openshift-apiserver-operator-796bbdcf4f-bj8wv\" (UID: \"9ab502ea-2cb2-4127-a081-d871168af9aa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915589 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d50d581-684f-48fb-86fa-86339fe67de7-config\") pod \"machine-approver-56656f9798-74nwf\" (UID: \"4d50d581-684f-48fb-86fa-86339fe67de7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915607 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-client-ca\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915628 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/266418ff-0098-46b7-a0b2-e930a1dfb1d8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-42p29\" (UID: \"266418ff-0098-46b7-a0b2-e930a1dfb1d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915649 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9ab502ea-2cb2-4127-a081-d871168af9aa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bj8wv\" (UID: \"9ab502ea-2cb2-4127-a081-d871168af9aa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915681 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39386f6f-4d16-4a81-9432-e486d9e6ee60-config\") pod \"machine-api-operator-5694c8668f-2lqpg\" (UID: \"39386f6f-4d16-4a81-9432-e486d9e6ee60\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915704 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqshq\" (UniqueName: \"kubernetes.io/projected/7c5c24ec-6be2-4b4c-a321-2559254d8158-kube-api-access-fqshq\") pod \"dns-operator-744455d44c-4wgdb\" (UID: \"7c5c24ec-6be2-4b4c-a321-2559254d8158\") " pod="openshift-dns-operator/dns-operator-744455d44c-4wgdb" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915724 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f262d5f0-ec94-4668-9a49-47616dd4625f-encryption-config\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915742 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4d50d581-684f-48fb-86fa-86339fe67de7-auth-proxy-config\") pod \"machine-approver-56656f9798-74nwf\" (UID: \"4d50d581-684f-48fb-86fa-86339fe67de7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915762 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b128405-242c-41da-9259-9e6fa646e505-etcd-client\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915782 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1b128405-242c-41da-9259-9e6fa646e505-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915804 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1b128405-242c-41da-9259-9e6fa646e505-encryption-config\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915823 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-config\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915842 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b128405-242c-41da-9259-9e6fa646e505-serving-cert\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915864 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c5c24ec-6be2-4b4c-a321-2559254d8158-metrics-tls\") pod \"dns-operator-744455d44c-4wgdb\" (UID: \"7c5c24ec-6be2-4b4c-a321-2559254d8158\") " pod="openshift-dns-operator/dns-operator-744455d44c-4wgdb" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915882 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f262d5f0-ec94-4668-9a49-47616dd4625f-node-pullsecrets\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915925 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg942\" (UniqueName: \"kubernetes.io/projected/4d50d581-684f-48fb-86fa-86339fe67de7-kube-api-access-cg942\") pod \"machine-approver-56656f9798-74nwf\" (UID: \"4d50d581-684f-48fb-86fa-86339fe67de7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915948 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ab502ea-2cb2-4127-a081-d871168af9aa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bj8wv\" (UID: \"9ab502ea-2cb2-4127-a081-d871168af9aa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915969 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eee21fd7-fd7b-4924-ac33-4e086deb424c-serving-cert\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 
06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.915991 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsxkz\" (UniqueName: \"kubernetes.io/projected/f262d5f0-ec94-4668-9a49-47616dd4625f-kube-api-access-rsxkz\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.916009 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c98cbede-25b7-40d4-b1ad-18e144e46bcc-serving-cert\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.916027 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1b128405-242c-41da-9259-9e6fa646e505-audit-policies\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.916054 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-etcd-serving-ca\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.916072 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f262d5f0-ec94-4668-9a49-47616dd4625f-audit-dir\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " 
pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.916094 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/39386f6f-4d16-4a81-9432-e486d9e6ee60-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2lqpg\" (UID: \"39386f6f-4d16-4a81-9432-e486d9e6ee60\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.916114 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/266418ff-0098-46b7-a0b2-e930a1dfb1d8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-42p29\" (UID: \"266418ff-0098-46b7-a0b2-e930a1dfb1d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.916134 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gql8l\" (UniqueName: \"kubernetes.io/projected/eee21fd7-fd7b-4924-ac33-4e086deb424c-kube-api-access-gql8l\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.916153 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b128405-242c-41da-9259-9e6fa646e505-audit-dir\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.916174 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/39386f6f-4d16-4a81-9432-e486d9e6ee60-images\") pod \"machine-api-operator-5694c8668f-2lqpg\" (UID: \"39386f6f-4d16-4a81-9432-e486d9e6ee60\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.916194 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-config\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.916213 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.917302 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.917345 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jsvj5"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.917769 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-audit\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 
08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.918015 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.918322 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-5mh6f"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.918761 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5mh6f" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.919301 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jsvj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.919827 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.920857 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f262d5f0-ec94-4668-9a49-47616dd4625f-node-pullsecrets\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.920935 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee21fd7-fd7b-4924-ac33-4e086deb424c-config\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.920951 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-config\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.921476 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b128405-242c-41da-9259-9e6fa646e505-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.922467 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39386f6f-4d16-4a81-9432-e486d9e6ee60-config\") pod \"machine-api-operator-5694c8668f-2lqpg\" (UID: \"39386f6f-4d16-4a81-9432-e486d9e6ee60\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.922713 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eee21fd7-fd7b-4924-ac33-4e086deb424c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.922916 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eee21fd7-fd7b-4924-ac33-4e086deb424c-service-ca-bundle\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.922957 4755 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1b128405-242c-41da-9259-9e6fa646e505-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.924900 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ab502ea-2cb2-4127-a081-d871168af9aa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bj8wv\" (UID: \"9ab502ea-2cb2-4127-a081-d871168af9aa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.925228 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d50d581-684f-48fb-86fa-86339fe67de7-config\") pod \"machine-approver-56656f9798-74nwf\" (UID: \"4d50d581-684f-48fb-86fa-86339fe67de7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.925383 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1b128405-242c-41da-9259-9e6fa646e505-audit-policies\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.925881 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hj99z"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.925921 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-etcd-serving-ca\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " 
pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.925970 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f262d5f0-ec94-4668-9a49-47616dd4625f-audit-dir\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.926102 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.926111 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-client-ca\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.926753 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4d50d581-684f-48fb-86fa-86339fe67de7-auth-proxy-config\") pod \"machine-approver-56656f9798-74nwf\" (UID: \"4d50d581-684f-48fb-86fa-86339fe67de7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.929350 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b128405-242c-41da-9259-9e6fa646e505-audit-dir\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.929718 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/f262d5f0-ec94-4668-9a49-47616dd4625f-image-import-ca\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.929842 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/266418ff-0098-46b7-a0b2-e930a1dfb1d8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-42p29\" (UID: \"266418ff-0098-46b7-a0b2-e930a1dfb1d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.930018 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.930517 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/39386f6f-4d16-4a81-9432-e486d9e6ee60-images\") pod \"machine-api-operator-5694c8668f-2lqpg\" (UID: \"39386f6f-4d16-4a81-9432-e486d9e6ee60\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.931464 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f262d5f0-ec94-4668-9a49-47616dd4625f-etcd-client\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.931863 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/c98cbede-25b7-40d4-b1ad-18e144e46bcc-serving-cert\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.932062 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/39386f6f-4d16-4a81-9432-e486d9e6ee60-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2lqpg\" (UID: \"39386f6f-4d16-4a81-9432-e486d9e6ee60\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.932076 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c5c24ec-6be2-4b4c-a321-2559254d8158-metrics-tls\") pod \"dns-operator-744455d44c-4wgdb\" (UID: \"7c5c24ec-6be2-4b4c-a321-2559254d8158\") " pod="openshift-dns-operator/dns-operator-744455d44c-4wgdb" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.938977 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/266418ff-0098-46b7-a0b2-e930a1dfb1d8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-42p29\" (UID: \"266418ff-0098-46b7-a0b2-e930a1dfb1d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.940168 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1b128405-242c-41da-9259-9e6fa646e505-encryption-config\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.940422 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ab502ea-2cb2-4127-a081-d871168af9aa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bj8wv\" (UID: \"9ab502ea-2cb2-4127-a081-d871168af9aa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.944607 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f262d5f0-ec94-4668-9a49-47616dd4625f-serving-cert\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.945328 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-config\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.946435 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4d50d581-684f-48fb-86fa-86339fe67de7-machine-approver-tls\") pod \"machine-approver-56656f9798-74nwf\" (UID: \"4d50d581-684f-48fb-86fa-86339fe67de7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.947412 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5snnf"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.947447 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 
08:24:42.947581 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.947684 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b128405-242c-41da-9259-9e6fa646e505-etcd-client\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.949246 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b128405-242c-41da-9259-9e6fa646e505-serving-cert\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.953357 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hztlt"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.957308 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f262d5f0-ec94-4668-9a49-47616dd4625f-encryption-config\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.966164 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.966379 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p47k9"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.969227 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.969307 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eee21fd7-fd7b-4924-ac33-4e086deb424c-serving-cert\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.970350 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.971447 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.973197 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.974329 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.975490 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zqsmk"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.977461 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hpgbv"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.978825 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4wgdb"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.979886 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-klxzw"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.981640 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hj99z"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.982136 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.982977 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.983170 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nrx4l"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.984290 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.985488 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.986601 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g6zp7"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.987684 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.989029 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.990219 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv"] 
Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.991353 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.993265 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xp227"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.994476 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fdr74"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.995713 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wllq2"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.996379 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wllq2" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.997435 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cnt4g"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.998114 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.998515 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272"] Oct 06 08:24:42 crc kubenswrapper[4755]: I1006 08:24:42.999872 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jsvj5"] Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.000956 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4"] Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.002105 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w"] Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.002376 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.004216 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wllq2"] Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.004239 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c"] Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.005003 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9"] Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.006253 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kfxvn"] Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.006897 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4vctk"] Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.008873 
4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cnt4g"] Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.010024 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4"] Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.011138 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b"] Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.016789 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.016818 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-config\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.016856 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.016878 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e61e6c52-261a-4ca9-b4aa-3da462aa4e7f-trusted-ca\") pod \"console-operator-58897d9998-5snnf\" (UID: \"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f\") " pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.016895 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dtv7\" (UniqueName: \"kubernetes.io/projected/e61e6c52-261a-4ca9-b4aa-3da462aa4e7f-kube-api-access-6dtv7\") pod \"console-operator-58897d9998-5snnf\" (UID: \"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f\") " pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.016916 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.017096 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.017139 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qhd7\" (UniqueName: \"kubernetes.io/projected/92199f0a-b1db-438f-8e44-446e840f07cf-kube-api-access-9qhd7\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.017226 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.017284 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.017324 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.017351 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-serving-cert\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.017382 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thch8\" (UniqueName: \"kubernetes.io/projected/d5ef001b-4224-45ce-bdca-5865c9092f0e-kube-api-access-thch8\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.017763 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.017914 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-trusted-ca-bundle\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.017948 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-oauth-serving-cert\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018005 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") 
" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018047 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-registry-certificates\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018067 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018168 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-bound-sa-token\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018256 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcgvq\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-kube-api-access-pcgvq\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018316 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018339 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-service-ca\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018373 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61e6c52-261a-4ca9-b4aa-3da462aa4e7f-config\") pod \"console-operator-58897d9998-5snnf\" (UID: \"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f\") " pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018413 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.018428 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 08:24:43.518412799 +0000 UTC m=+140.347728013 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018469 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-oauth-config\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018490 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61e6c52-261a-4ca9-b4aa-3da462aa4e7f-serving-cert\") pod \"console-operator-58897d9998-5snnf\" (UID: \"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f\") " pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018550 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018618 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-trusted-ca\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018638 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-registry-tls\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018654 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-audit-policies\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018669 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.018721 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92199f0a-b1db-438f-8e44-446e840f07cf-audit-dir\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: 
\"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.023536 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.043553 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.063510 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.082996 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.103252 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.119369 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.119662 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-trusted-ca\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 
08:24:43.119727 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5aecf36c-e9bc-41d1-b417-d8c81c91cdbe-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tmlcx\" (UID: \"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.119770 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-registry-tls\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.119870 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:43.619831572 +0000 UTC m=+140.449146786 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.119994 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-audit-policies\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120181 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120248 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/84e010ca-d47c-40d6-8b18-d67164e60d0b-metrics-tls\") pod \"dns-default-cnt4g\" (UID: \"84e010ca-d47c-40d6-8b18-d67164e60d0b\") " pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120289 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92199f0a-b1db-438f-8e44-446e840f07cf-audit-dir\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: 
\"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120316 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120339 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-config\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120385 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120405 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e61e6c52-261a-4ca9-b4aa-3da462aa4e7f-trusted-ca\") pod \"console-operator-58897d9998-5snnf\" (UID: \"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f\") " pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120426 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dtv7\" (UniqueName: 
\"kubernetes.io/projected/e61e6c52-261a-4ca9-b4aa-3da462aa4e7f-kube-api-access-6dtv7\") pod \"console-operator-58897d9998-5snnf\" (UID: \"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f\") " pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120465 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120487 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5aecf36c-e9bc-41d1-b417-d8c81c91cdbe-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tmlcx\" (UID: \"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120514 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4lhw\" (UniqueName: \"kubernetes.io/projected/e2377494-c95e-4c4e-a37b-b2a7edd85fad-kube-api-access-k4lhw\") pod \"machine-config-operator-74547568cd-tr47c\" (UID: \"e2377494-c95e-4c4e-a37b-b2a7edd85fad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120581 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120608 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qhd7\" (UniqueName: \"kubernetes.io/projected/92199f0a-b1db-438f-8e44-446e840f07cf-kube-api-access-9qhd7\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120605 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92199f0a-b1db-438f-8e44-446e840f07cf-audit-dir\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120656 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120764 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrgwd\" (UniqueName: \"kubernetes.io/projected/5aecf36c-e9bc-41d1-b417-d8c81c91cdbe-kube-api-access-hrgwd\") pod \"cluster-image-registry-operator-dc59b4c8b-tmlcx\" (UID: \"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120801 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/84e010ca-d47c-40d6-8b18-d67164e60d0b-config-volume\") pod \"dns-default-cnt4g\" (UID: \"84e010ca-d47c-40d6-8b18-d67164e60d0b\") " pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120837 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2377494-c95e-4c4e-a37b-b2a7edd85fad-proxy-tls\") pod \"machine-config-operator-74547568cd-tr47c\" (UID: \"e2377494-c95e-4c4e-a37b-b2a7edd85fad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120876 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120919 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120945 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-serving-cert\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120952 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-audit-policies\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.120970 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thch8\" (UniqueName: \"kubernetes.io/projected/d5ef001b-4224-45ce-bdca-5865c9092f0e-kube-api-access-thch8\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.121027 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/442d39b2-76cb-4123-a4bc-2dbc8ea62041-node-bootstrap-token\") pod \"machine-config-server-5mh6f\" (UID: \"442d39b2-76cb-4123-a4bc-2dbc8ea62041\") " pod="openshift-machine-config-operator/machine-config-server-5mh6f" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.121060 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.121080 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-trusted-ca-bundle\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " 
pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.121096 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-oauth-serving-cert\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.121126 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.121174 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-registry-certificates\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.121211 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.121298 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/442d39b2-76cb-4123-a4bc-2dbc8ea62041-certs\") pod \"machine-config-server-5mh6f\" (UID: \"442d39b2-76cb-4123-a4bc-2dbc8ea62041\") " pod="openshift-machine-config-operator/machine-config-server-5mh6f" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.121380 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-bound-sa-token\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.121440 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:43.621433941 +0000 UTC m=+140.450749145 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.121444 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5aecf36c-e9bc-41d1-b417-d8c81c91cdbe-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tmlcx\" (UID: \"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.122784 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2377494-c95e-4c4e-a37b-b2a7edd85fad-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tr47c\" (UID: \"e2377494-c95e-4c4e-a37b-b2a7edd85fad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.123031 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-trusted-ca\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.123240 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.123479 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.124219 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: 
\"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.125290 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.126706 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-registry-certificates\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.127000 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.127006 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.127584 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-config\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.128029 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-oauth-serving-cert\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.128468 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.122932 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcgvq\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-kube-api-access-pcgvq\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.128723 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.128849 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-service-ca\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " 
pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.128979 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e2377494-c95e-4c4e-a37b-b2a7edd85fad-images\") pod \"machine-config-operator-74547568cd-tr47c\" (UID: \"e2377494-c95e-4c4e-a37b-b2a7edd85fad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.129214 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61e6c52-261a-4ca9-b4aa-3da462aa4e7f-config\") pod \"console-operator-58897d9998-5snnf\" (UID: \"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f\") " pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.129367 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t468d\" (UniqueName: \"kubernetes.io/projected/442d39b2-76cb-4123-a4bc-2dbc8ea62041-kube-api-access-t468d\") pod \"machine-config-server-5mh6f\" (UID: \"442d39b2-76cb-4123-a4bc-2dbc8ea62041\") " pod="openshift-machine-config-operator/machine-config-server-5mh6f" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.129507 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.129678 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.129691 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61e6c52-261a-4ca9-b4aa-3da462aa4e7f-serving-cert\") pod \"console-operator-58897d9998-5snnf\" (UID: \"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f\") " pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.129914 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-oauth-config\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.130068 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.130239 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsvbb\" (UniqueName: \"kubernetes.io/projected/84e010ca-d47c-40d6-8b18-d67164e60d0b-kube-api-access-qsvbb\") pod \"dns-default-cnt4g\" (UID: \"84e010ca-d47c-40d6-8b18-d67164e60d0b\") " pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.130253 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-service-ca\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.131016 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61e6c52-261a-4ca9-b4aa-3da462aa4e7f-config\") pod \"console-operator-58897d9998-5snnf\" (UID: \"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f\") " pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.129926 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-serving-cert\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.131325 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e61e6c52-261a-4ca9-b4aa-3da462aa4e7f-trusted-ca\") pod \"console-operator-58897d9998-5snnf\" (UID: \"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f\") " pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.129577 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.131661 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-trusted-ca-bundle\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.132451 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-registry-tls\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.133854 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.134224 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.134396 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-oauth-config\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 
08:24:43.136626 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.136641 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61e6c52-261a-4ca9-b4aa-3da462aa4e7f-serving-cert\") pod \"console-operator-58897d9998-5snnf\" (UID: \"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f\") " pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.137287 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.157082 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.159438 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.170401 4755 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"trusted-ca" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.182650 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.203356 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.232076 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.232311 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:43.732276038 +0000 UTC m=+140.561591262 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.232395 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5aecf36c-e9bc-41d1-b417-d8c81c91cdbe-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tmlcx\" (UID: \"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.232427 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2377494-c95e-4c4e-a37b-b2a7edd85fad-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tr47c\" (UID: \"e2377494-c95e-4c4e-a37b-b2a7edd85fad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.232477 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e2377494-c95e-4c4e-a37b-b2a7edd85fad-images\") pod \"machine-config-operator-74547568cd-tr47c\" (UID: \"e2377494-c95e-4c4e-a37b-b2a7edd85fad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.232512 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t468d\" (UniqueName: 
\"kubernetes.io/projected/442d39b2-76cb-4123-a4bc-2dbc8ea62041-kube-api-access-t468d\") pod \"machine-config-server-5mh6f\" (UID: \"442d39b2-76cb-4123-a4bc-2dbc8ea62041\") " pod="openshift-machine-config-operator/machine-config-server-5mh6f" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.232672 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsvbb\" (UniqueName: \"kubernetes.io/projected/84e010ca-d47c-40d6-8b18-d67164e60d0b-kube-api-access-qsvbb\") pod \"dns-default-cnt4g\" (UID: \"84e010ca-d47c-40d6-8b18-d67164e60d0b\") " pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.232734 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5aecf36c-e9bc-41d1-b417-d8c81c91cdbe-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tmlcx\" (UID: \"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.232774 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/84e010ca-d47c-40d6-8b18-d67164e60d0b-metrics-tls\") pod \"dns-default-cnt4g\" (UID: \"84e010ca-d47c-40d6-8b18-d67164e60d0b\") " pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.232867 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5aecf36c-e9bc-41d1-b417-d8c81c91cdbe-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tmlcx\" (UID: \"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.232897 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-k4lhw\" (UniqueName: \"kubernetes.io/projected/e2377494-c95e-4c4e-a37b-b2a7edd85fad-kube-api-access-k4lhw\") pod \"machine-config-operator-74547568cd-tr47c\" (UID: \"e2377494-c95e-4c4e-a37b-b2a7edd85fad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.232979 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84e010ca-d47c-40d6-8b18-d67164e60d0b-config-volume\") pod \"dns-default-cnt4g\" (UID: \"84e010ca-d47c-40d6-8b18-d67164e60d0b\") " pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.233003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrgwd\" (UniqueName: \"kubernetes.io/projected/5aecf36c-e9bc-41d1-b417-d8c81c91cdbe-kube-api-access-hrgwd\") pod \"cluster-image-registry-operator-dc59b4c8b-tmlcx\" (UID: \"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.233031 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2377494-c95e-4c4e-a37b-b2a7edd85fad-proxy-tls\") pod \"machine-config-operator-74547568cd-tr47c\" (UID: \"e2377494-c95e-4c4e-a37b-b2a7edd85fad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.233078 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/442d39b2-76cb-4123-a4bc-2dbc8ea62041-node-bootstrap-token\") pod \"machine-config-server-5mh6f\" (UID: \"442d39b2-76cb-4123-a4bc-2dbc8ea62041\") " 
pod="openshift-machine-config-operator/machine-config-server-5mh6f" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.233115 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.233153 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/442d39b2-76cb-4123-a4bc-2dbc8ea62041-certs\") pod \"machine-config-server-5mh6f\" (UID: \"442d39b2-76cb-4123-a4bc-2dbc8ea62041\") " pod="openshift-machine-config-operator/machine-config-server-5mh6f" Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.234719 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:43.734695667 +0000 UTC m=+140.564010881 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.235287 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2377494-c95e-4c4e-a37b-b2a7edd85fad-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tr47c\" (UID: \"e2377494-c95e-4c4e-a37b-b2a7edd85fad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.236483 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5aecf36c-e9bc-41d1-b417-d8c81c91cdbe-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tmlcx\" (UID: \"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.238184 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5aecf36c-e9bc-41d1-b417-d8c81c91cdbe-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tmlcx\" (UID: \"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.243066 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 06 08:24:43 crc 
kubenswrapper[4755]: I1006 08:24:43.262663 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.282964 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.303471 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.323915 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.334029 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.334528 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:43.83449287 +0000 UTC m=+140.663808114 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.334831 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.335321 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:43.83530318 +0000 UTC m=+140.664618424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.343273 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.363864 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.383243 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.403192 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.422865 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.436548 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.436879 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:43.936840085 +0000 UTC m=+140.766155329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.437600 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.438196 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:43.938172019 +0000 UTC m=+140.767487273 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.444457 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.463296 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.484388 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.504237 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.523958 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.539197 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.539337 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.039305394 +0000 UTC m=+140.868620648 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.539611 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.539935 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.039926669 +0000 UTC m=+140.869241883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.544949 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.563465 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.584323 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.604084 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.624365 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.640821 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.641459 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.141424124 +0000 UTC m=+140.970739388 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.643634 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.663729 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.683636 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.685923 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e2377494-c95e-4c4e-a37b-b2a7edd85fad-images\") pod \"machine-config-operator-74547568cd-tr47c\" (UID: \"e2377494-c95e-4c4e-a37b-b2a7edd85fad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.702373 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 06 08:24:43 crc 
kubenswrapper[4755]: I1006 08:24:43.723132 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.728873 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2377494-c95e-4c4e-a37b-b2a7edd85fad-proxy-tls\") pod \"machine-config-operator-74547568cd-tr47c\" (UID: \"e2377494-c95e-4c4e-a37b-b2a7edd85fad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.742712 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.743449 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.243416921 +0000 UTC m=+141.072732155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.752736 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.763782 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.782848 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.803466 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.824313 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.844703 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.844690 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:43 crc kubenswrapper[4755]: 
E1006 08:24:43.845724 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.345657024 +0000 UTC m=+141.174972278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.863601 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.882721 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.901632 4755 request.go:700] Waited for 1.005189272s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcatalog-operator-serving-cert&limit=500&resourceVersion=0 Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.903739 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.922967 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.942950 4755 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.946497 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:43 crc kubenswrapper[4755]: E1006 08:24:43.947158 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.447124977 +0000 UTC m=+141.276440231 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.963404 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 06 08:24:43 crc kubenswrapper[4755]: I1006 08:24:43.984465 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.004150 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.025517 4755 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.047498 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.048123 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.548034297 +0000 UTC m=+141.377349541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.048285 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.048905 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.049350 4755 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.549335409 +0000 UTC m=+141.378650623 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.063213 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.084345 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.103468 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.123695 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.143014 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.152621 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.153047 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.653000117 +0000 UTC m=+141.482315361 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.153291 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.154279 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.654251338 +0000 UTC m=+141.483566562 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.163215 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.182924 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.202151 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.223441 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.234459 4755 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.234613 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/442d39b2-76cb-4123-a4bc-2dbc8ea62041-node-bootstrap-token podName:442d39b2-76cb-4123-a4bc-2dbc8ea62041 nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.734581228 +0000 UTC m=+141.563896442 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/442d39b2-76cb-4123-a4bc-2dbc8ea62041-node-bootstrap-token") pod "machine-config-server-5mh6f" (UID: "442d39b2-76cb-4123-a4bc-2dbc8ea62041") : failed to sync secret cache: timed out waiting for the condition Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.234764 4755 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.234877 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/442d39b2-76cb-4123-a4bc-2dbc8ea62041-certs podName:442d39b2-76cb-4123-a4bc-2dbc8ea62041 nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.734843165 +0000 UTC m=+141.564158419 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/442d39b2-76cb-4123-a4bc-2dbc8ea62041-certs") pod "machine-config-server-5mh6f" (UID: "442d39b2-76cb-4123-a4bc-2dbc8ea62041") : failed to sync secret cache: timed out waiting for the condition Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.234885 4755 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.234941 4755 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.234996 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/84e010ca-d47c-40d6-8b18-d67164e60d0b-config-volume podName:84e010ca-d47c-40d6-8b18-d67164e60d0b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.734967858 +0000 UTC m=+141.564283252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/84e010ca-d47c-40d6-8b18-d67164e60d0b-config-volume") pod "dns-default-cnt4g" (UID: "84e010ca-d47c-40d6-8b18-d67164e60d0b") : failed to sync configmap cache: timed out waiting for the condition Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.235036 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84e010ca-d47c-40d6-8b18-d67164e60d0b-metrics-tls podName:84e010ca-d47c-40d6-8b18-d67164e60d0b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.735017189 +0000 UTC m=+141.564332633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/84e010ca-d47c-40d6-8b18-d67164e60d0b-metrics-tls") pod "dns-default-cnt4g" (UID: "84e010ca-d47c-40d6-8b18-d67164e60d0b") : failed to sync secret cache: timed out waiting for the condition Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.243900 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.254868 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.255013 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.754989204 +0000 UTC m=+141.584304458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.255328 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.255902 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.755884476 +0000 UTC m=+141.585199720 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.262958 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.283296 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.303209 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.324054 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.344024 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.357216 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.357487 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.857451582 +0000 UTC m=+141.686766796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.358011 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.358415 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.858408206 +0000 UTC m=+141.687723420 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.363475 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.384229 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.403907 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.423135 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.458789 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.459031 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 08:24:44.958990168 +0000 UTC m=+141.788305392 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.459513 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.460101 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:44.960062934 +0000 UTC m=+141.789378178 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.464240 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.466895 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqdv5\" (UniqueName: \"kubernetes.io/projected/39386f6f-4d16-4a81-9432-e486d9e6ee60-kube-api-access-jqdv5\") pod \"machine-api-operator-5694c8668f-2lqpg\" (UID: \"39386f6f-4d16-4a81-9432-e486d9e6ee60\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.484746 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.532292 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsxkz\" (UniqueName: \"kubernetes.io/projected/f262d5f0-ec94-4668-9a49-47616dd4625f-kube-api-access-rsxkz\") pod \"apiserver-76f77b778f-hztlt\" (UID: \"f262d5f0-ec94-4668-9a49-47616dd4625f\") " pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.552655 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqshq\" (UniqueName: \"kubernetes.io/projected/7c5c24ec-6be2-4b4c-a321-2559254d8158-kube-api-access-fqshq\") pod \"dns-operator-744455d44c-4wgdb\" (UID: 
\"7c5c24ec-6be2-4b4c-a321-2559254d8158\") " pod="openshift-dns-operator/dns-operator-744455d44c-4wgdb" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.561274 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.561610 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:45.061537568 +0000 UTC m=+141.890852802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.561956 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.563452 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:45.063435755 +0000 UTC m=+141.892750989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.572599 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg942\" (UniqueName: \"kubernetes.io/projected/4d50d581-684f-48fb-86fa-86339fe67de7-kube-api-access-cg942\") pod \"machine-approver-56656f9798-74nwf\" (UID: \"4d50d581-684f-48fb-86fa-86339fe67de7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.583958 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.584400 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7rk7\" (UniqueName: \"kubernetes.io/projected/c98cbede-25b7-40d4-b1ad-18e144e46bcc-kube-api-access-r7rk7\") pod \"controller-manager-879f6c89f-4skj5\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.603380 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 
08:24:44.623640 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.637031 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.664503 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.665664 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:45.165526714 +0000 UTC m=+141.994841978 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.672064 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9fk4\" (UniqueName: \"kubernetes.io/projected/1b128405-242c-41da-9259-9e6fa646e505-kube-api-access-g9fk4\") pod \"apiserver-7bbb656c7d-h5272\" (UID: \"1b128405-242c-41da-9259-9e6fa646e505\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.675703 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.681520 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.684463 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.693464 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfr9\" (UniqueName: \"kubernetes.io/projected/9ab502ea-2cb2-4127-a081-d871168af9aa-kube-api-access-shfr9\") pod \"openshift-apiserver-operator-796bbdcf4f-bj8wv\" (UID: \"9ab502ea-2cb2-4127-a081-d871168af9aa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.703238 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.706382 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.714336 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.724446 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.767248 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/84e010ca-d47c-40d6-8b18-d67164e60d0b-metrics-tls\") pod \"dns-default-cnt4g\" (UID: \"84e010ca-d47c-40d6-8b18-d67164e60d0b\") " pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.767357 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84e010ca-d47c-40d6-8b18-d67164e60d0b-config-volume\") pod \"dns-default-cnt4g\" (UID: \"84e010ca-d47c-40d6-8b18-d67164e60d0b\") " pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.767406 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/442d39b2-76cb-4123-a4bc-2dbc8ea62041-node-bootstrap-token\") pod \"machine-config-server-5mh6f\" (UID: \"442d39b2-76cb-4123-a4bc-2dbc8ea62041\") " pod="openshift-machine-config-operator/machine-config-server-5mh6f" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.767440 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.767471 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/442d39b2-76cb-4123-a4bc-2dbc8ea62041-certs\") pod \"machine-config-server-5mh6f\" (UID: \"442d39b2-76cb-4123-a4bc-2dbc8ea62041\") " pod="openshift-machine-config-operator/machine-config-server-5mh6f" Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.769112 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:45.269082629 +0000 UTC m=+142.098397883 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.772792 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/442d39b2-76cb-4123-a4bc-2dbc8ea62041-certs\") pod \"machine-config-server-5mh6f\" (UID: \"442d39b2-76cb-4123-a4bc-2dbc8ea62041\") " pod="openshift-machine-config-operator/machine-config-server-5mh6f" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.773967 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/442d39b2-76cb-4123-a4bc-2dbc8ea62041-node-bootstrap-token\") pod \"machine-config-server-5mh6f\" (UID: \"442d39b2-76cb-4123-a4bc-2dbc8ea62041\") " pod="openshift-machine-config-operator/machine-config-server-5mh6f" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.785927 
4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gql8l\" (UniqueName: \"kubernetes.io/projected/eee21fd7-fd7b-4924-ac33-4e086deb424c-kube-api-access-gql8l\") pod \"authentication-operator-69f744f599-hpgbv\" (UID: \"eee21fd7-fd7b-4924-ac33-4e086deb424c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.786668 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.792908 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j4br\" (UniqueName: \"kubernetes.io/projected/266418ff-0098-46b7-a0b2-e930a1dfb1d8-kube-api-access-6j4br\") pod \"openshift-controller-manager-operator-756b6f6bc6-42p29\" (UID: \"266418ff-0098-46b7-a0b2-e930a1dfb1d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.802114 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.804496 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.825026 4755 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.830158 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4wgdb" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.844859 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.863038 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.868545 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.868760 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:45.368728158 +0000 UTC m=+142.198043372 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.868967 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.869716 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:45.369657671 +0000 UTC m=+142.198972885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.885011 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.895824 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.903905 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.921007 4755 request.go:700] Waited for 1.922669702s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/configmaps?fieldSelector=metadata.name%3Ddns-default&limit=500&resourceVersion=0 Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.921030 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hztlt"] Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.923310 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.929300 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84e010ca-d47c-40d6-8b18-d67164e60d0b-config-volume\") pod \"dns-default-cnt4g\" (UID: \"84e010ca-d47c-40d6-8b18-d67164e60d0b\") " 
pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.944053 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.955099 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/84e010ca-d47c-40d6-8b18-d67164e60d0b-metrics-tls\") pod \"dns-default-cnt4g\" (UID: \"84e010ca-d47c-40d6-8b18-d67164e60d0b\") " pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.964177 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.968170 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2lqpg"] Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.970448 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:44 crc kubenswrapper[4755]: E1006 08:24:44.970814 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:45.470790366 +0000 UTC m=+142.300105570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:44 crc kubenswrapper[4755]: I1006 08:24:44.995939 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.030866 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thch8\" (UniqueName: \"kubernetes.io/projected/d5ef001b-4224-45ce-bdca-5865c9092f0e-kube-api-access-thch8\") pod \"console-f9d7485db-nrx4l\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.039366 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcgvq\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-kube-api-access-pcgvq\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.057238 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.068416 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-bound-sa-token\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: 
\"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.072276 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:45 crc kubenswrapper[4755]: E1006 08:24:45.072852 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:45.572835425 +0000 UTC m=+142.402150649 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.085947 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qhd7\" (UniqueName: \"kubernetes.io/projected/92199f0a-b1db-438f-8e44-446e840f07cf-kube-api-access-9qhd7\") pod \"oauth-openshift-558db77b4-p47k9\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.090673 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29"] Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.102121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dtv7\" (UniqueName: \"kubernetes.io/projected/e61e6c52-261a-4ca9-b4aa-3da462aa4e7f-kube-api-access-6dtv7\") pod \"console-operator-58897d9998-5snnf\" (UID: \"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f\") " pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.106674 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.138263 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4wgdb"] Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.144727 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5aecf36c-e9bc-41d1-b417-d8c81c91cdbe-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tmlcx\" (UID: \"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.146319 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272"] Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.165067 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrgwd\" (UniqueName: \"kubernetes.io/projected/5aecf36c-e9bc-41d1-b417-d8c81c91cdbe-kube-api-access-hrgwd\") pod \"cluster-image-registry-operator-dc59b4c8b-tmlcx\" (UID: \"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 
08:24:45.173117 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:45 crc kubenswrapper[4755]: E1006 08:24:45.173254 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:45.673230102 +0000 UTC m=+142.502545316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.173666 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:45 crc kubenswrapper[4755]: E1006 08:24:45.174026 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 08:24:45.674016381 +0000 UTC m=+142.503331595 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.193737 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsvbb\" (UniqueName: \"kubernetes.io/projected/84e010ca-d47c-40d6-8b18-d67164e60d0b-kube-api-access-qsvbb\") pod \"dns-default-cnt4g\" (UID: \"84e010ca-d47c-40d6-8b18-d67164e60d0b\") " pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.201250 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4lhw\" (UniqueName: \"kubernetes.io/projected/e2377494-c95e-4c4e-a37b-b2a7edd85fad-kube-api-access-k4lhw\") pod \"machine-config-operator-74547568cd-tr47c\" (UID: \"e2377494-c95e-4c4e-a37b-b2a7edd85fad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.208452 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" Oct 06 08:24:45 crc kubenswrapper[4755]: W1006 08:24:45.219277 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b128405_242c_41da_9259_9e6fa646e505.slice/crio-275ba495053562330eba8f791c86132ab90c8f5a74c7b63cf8ddea934ff25855 WatchSource:0}: Error finding container 275ba495053562330eba8f791c86132ab90c8f5a74c7b63cf8ddea934ff25855: Status 404 returned error can't find the container with id 275ba495053562330eba8f791c86132ab90c8f5a74c7b63cf8ddea934ff25855 Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.221869 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t468d\" (UniqueName: \"kubernetes.io/projected/442d39b2-76cb-4123-a4bc-2dbc8ea62041-kube-api-access-t468d\") pod \"machine-config-server-5mh6f\" (UID: \"442d39b2-76cb-4123-a4bc-2dbc8ea62041\") " pod="openshift-machine-config-operator/machine-config-server-5mh6f" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.246777 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv"] Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.249763 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4skj5"] Oct 06 08:24:45 crc kubenswrapper[4755]: W1006 08:24:45.270290 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc98cbede_25b7_40d4_b1ad_18e144e46bcc.slice/crio-9b6f0d48858e6a81e4877e59a03930926121df05d6b8d8c67e1e623bb9cd576d WatchSource:0}: Error finding container 9b6f0d48858e6a81e4877e59a03930926121df05d6b8d8c67e1e623bb9cd576d: Status 404 returned error can't find the container with id 9b6f0d48858e6a81e4877e59a03930926121df05d6b8d8c67e1e623bb9cd576d Oct 06 
08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.274936 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:45 crc kubenswrapper[4755]: E1006 08:24:45.275235 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:45.775090196 +0000 UTC m=+142.604405400 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.275499 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qbd5\" (UniqueName: \"kubernetes.io/projected/c15c418e-734c-43df-b3e2-20619f626df3-kube-api-access-4qbd5\") pod \"collect-profiles-29328975-gtck4\" (UID: \"c15c418e-734c-43df-b3e2-20619f626df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.275612 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/960d9d23-73b6-49b2-8772-eca49d507f2f-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-zqsmk\" (UID: \"960d9d23-73b6-49b2-8772-eca49d507f2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.275654 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-mountpoint-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a7ab2b9-1233-4e5b-b0da-5d43ccc24665-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7vbhs\" (UID: \"8a7ab2b9-1233-4e5b-b0da-5d43ccc24665\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277208 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15c418e-734c-43df-b3e2-20619f626df3-secret-volume\") pod \"collect-profiles-29328975-gtck4\" (UID: \"c15c418e-734c-43df-b3e2-20619f626df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277275 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2b5b6ae-2351-4e36-a2d0-639c4b777a1e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-b468l\" (UID: \"b2b5b6ae-2351-4e36-a2d0-639c4b777a1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277335 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a-serving-cert\") pod \"openshift-config-operator-7777fb866f-gvsjx\" (UID: \"8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277357 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29a62c4d-4d44-4b7f-b115-15469a82976e-config\") pod \"kube-controller-manager-operator-78b949d7b-2c9z7\" (UID: \"29a62c4d-4d44-4b7f-b115-15469a82976e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277422 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkqtq\" (UniqueName: \"kubernetes.io/projected/f9fb648b-fea0-448d-a4d3-d967953806c9-kube-api-access-jkqtq\") pod \"catalog-operator-68c6474976-mjp6w\" (UID: \"f9fb648b-fea0-448d-a4d3-d967953806c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277476 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48jmz\" (UniqueName: \"kubernetes.io/projected/38c940d4-7aae-4661-9c98-aaab303881e5-kube-api-access-48jmz\") pod \"machine-config-controller-84d6567774-6fx4d\" (UID: \"38c940d4-7aae-4661-9c98-aaab303881e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277552 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zbsb\" (UniqueName: 
\"kubernetes.io/projected/ba7bf7ad-087d-4557-8647-02a56285e4c4-kube-api-access-7zbsb\") pod \"service-ca-operator-777779d784-xp227\" (UID: \"ba7bf7ad-087d-4557-8647-02a56285e4c4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277588 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wm5h\" (UniqueName: \"kubernetes.io/projected/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-kube-api-access-6wm5h\") pod \"route-controller-manager-6576b87f9c-6nnfs\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277608 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a7ab2b9-1233-4e5b-b0da-5d43ccc24665-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7vbhs\" (UID: \"8a7ab2b9-1233-4e5b-b0da-5d43ccc24665\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277649 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce95f03d-ac27-4976-883b-deedb1c4b5ac-signing-key\") pod \"service-ca-9c57cc56f-kfxvn\" (UID: \"ce95f03d-ac27-4976-883b-deedb1c4b5ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277669 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2e218dd8-5ee2-4355-8304-be35e207d366-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c4fl9\" (UID: \"2e218dd8-5ee2-4355-8304-be35e207d366\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277740 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1e65680a-1fd6-41e3-a51a-c5bc7654216f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fdr74\" (UID: \"1e65680a-1fd6-41e3-a51a-c5bc7654216f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fdr74" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277762 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkkr7\" (UniqueName: \"kubernetes.io/projected/1aee32aa-36a7-4bf9-80ed-4afdc433746a-kube-api-access-wkkr7\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277787 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvjmh\" (UniqueName: \"kubernetes.io/projected/d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0-kube-api-access-qvjmh\") pod \"ingress-canary-wllq2\" (UID: \"d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0\") " pod="openshift-ingress-canary/ingress-canary-wllq2" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277829 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/62fd34fc-beed-483e-bd3c-3eeed8239d05-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lzq7b\" (UID: \"62fd34fc-beed-483e-bd3c-3eeed8239d05\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277853 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e14368cf-4d62-407f-b4b4-2318df6a6382-metrics-certs\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.277958 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzvzg\" (UniqueName: \"kubernetes.io/projected/d67128b0-39d6-49ca-a8f0-337fdb64ef39-kube-api-access-jzvzg\") pod \"migrator-59844c95c7-jsvj5\" (UID: \"d67128b0-39d6-49ca-a8f0-337fdb64ef39\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jsvj5" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.281454 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aee32aa-36a7-4bf9-80ed-4afdc433746a-serving-cert\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.281491 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxqdd\" (UniqueName: \"kubernetes.io/projected/2e218dd8-5ee2-4355-8304-be35e207d366-kube-api-access-dxqdd\") pod \"olm-operator-6b444d44fb-c4fl9\" (UID: \"2e218dd8-5ee2-4355-8304-be35e207d366\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.282072 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: 
\"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:45 crc kubenswrapper[4755]: E1006 08:24:45.282331 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:45.782318125 +0000 UTC m=+142.611633339 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.282380 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce95f03d-ac27-4976-883b-deedb1c4b5ac-signing-cabundle\") pod \"service-ca-9c57cc56f-kfxvn\" (UID: \"ce95f03d-ac27-4976-883b-deedb1c4b5ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.282586 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c08ae61-3d4a-4905-94aa-88e03148b073-metrics-tls\") pod \"ingress-operator-5b745b69d9-dzfqk\" (UID: \"7c08ae61-3d4a-4905-94aa-88e03148b073\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.282931 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm89v\" (UniqueName: 
\"kubernetes.io/projected/833f574b-27b1-4b3c-a2d9-2e22d1434926-kube-api-access-wm89v\") pod \"kube-storage-version-migrator-operator-b67b599dd-r52j4\" (UID: \"833f574b-27b1-4b3c-a2d9-2e22d1434926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.283041 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba7bf7ad-087d-4557-8647-02a56285e4c4-serving-cert\") pod \"service-ca-operator-777779d784-xp227\" (UID: \"ba7bf7ad-087d-4557-8647-02a56285e4c4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.286256 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1aee32aa-36a7-4bf9-80ed-4afdc433746a-etcd-service-ca\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.286321 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hxhc\" (UniqueName: \"kubernetes.io/projected/8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a-kube-api-access-2hxhc\") pod \"openshift-config-operator-7777fb866f-gvsjx\" (UID: \"8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.286342 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15c418e-734c-43df-b3e2-20619f626df3-config-volume\") pod \"collect-profiles-29328975-gtck4\" (UID: 
\"c15c418e-734c-43df-b3e2-20619f626df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.286381 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e14368cf-4d62-407f-b4b4-2318df6a6382-default-certificate\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.286478 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-264j8\" (UniqueName: \"kubernetes.io/projected/7c08ae61-3d4a-4905-94aa-88e03148b073-kube-api-access-264j8\") pod \"ingress-operator-5b745b69d9-dzfqk\" (UID: \"7c08ae61-3d4a-4905-94aa-88e03148b073\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.286586 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-serving-cert\") pod \"route-controller-manager-6576b87f9c-6nnfs\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.286608 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/960d9d23-73b6-49b2-8772-eca49d507f2f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zqsmk\" (UID: \"960d9d23-73b6-49b2-8772-eca49d507f2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.286672 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25rss\" (UniqueName: \"kubernetes.io/projected/62fd34fc-beed-483e-bd3c-3eeed8239d05-kube-api-access-25rss\") pod \"package-server-manager-789f6589d5-lzq7b\" (UID: \"62fd34fc-beed-483e-bd3c-3eeed8239d05\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.286705 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c08ae61-3d4a-4905-94aa-88e03148b073-trusted-ca\") pod \"ingress-operator-5b745b69d9-dzfqk\" (UID: \"7c08ae61-3d4a-4905-94aa-88e03148b073\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.286744 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/343af78f-ce0c-4feb-a8d9-38c5a524b342-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qxwpv\" (UID: \"343af78f-ce0c-4feb-a8d9-38c5a524b342\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.286778 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/833f574b-27b1-4b3c-a2d9-2e22d1434926-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-r52j4\" (UID: \"833f574b-27b1-4b3c-a2d9-2e22d1434926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.286795 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1aee32aa-36a7-4bf9-80ed-4afdc433746a-etcd-client\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.287679 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38c940d4-7aae-4661-9c98-aaab303881e5-proxy-tls\") pod \"machine-config-controller-84d6567774-6fx4d\" (UID: \"38c940d4-7aae-4661-9c98-aaab303881e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.287862 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-registration-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.287908 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x775t\" (UniqueName: \"kubernetes.io/projected/e14368cf-4d62-407f-b4b4-2318df6a6382-kube-api-access-x775t\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.287929 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2e218dd8-5ee2-4355-8304-be35e207d366-srv-cert\") pod \"olm-operator-6b444d44fb-c4fl9\" (UID: \"2e218dd8-5ee2-4355-8304-be35e207d366\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 
08:24:45.288054 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29a62c4d-4d44-4b7f-b115-15469a82976e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2c9z7\" (UID: \"29a62c4d-4d44-4b7f-b115-15469a82976e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.288091 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0-cert\") pod \"ingress-canary-wllq2\" (UID: \"d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0\") " pod="openshift-ingress-canary/ingress-canary-wllq2" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.288123 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49wf\" (UniqueName: \"kubernetes.io/projected/79747f17-84d8-434a-afdb-c737c276ae90-kube-api-access-k49wf\") pod \"cluster-samples-operator-665b6dd947-dnfbc\" (UID: \"79747f17-84d8-434a-afdb-c737c276ae90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.288154 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c08ae61-3d4a-4905-94aa-88e03148b073-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dzfqk\" (UID: \"7c08ae61-3d4a-4905-94aa-88e03148b073\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.290054 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk4dc\" (UniqueName: 
\"kubernetes.io/projected/e47b738a-2656-4f75-8ce7-da45f4e17424-kube-api-access-xk4dc\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.290095 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2b5b6ae-2351-4e36-a2d0-639c4b777a1e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-b468l\" (UID: \"b2b5b6ae-2351-4e36-a2d0-639c4b777a1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.290125 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aee32aa-36a7-4bf9-80ed-4afdc433746a-config\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.290150 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29a62c4d-4d44-4b7f-b115-15469a82976e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2c9z7\" (UID: \"29a62c4d-4d44-4b7f-b115-15469a82976e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.290176 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/776e4025-0157-4c0d-af55-ae80dcf7250d-tmpfs\") pod \"packageserver-d55dfcdfc-njfmq\" (UID: \"776e4025-0157-4c0d-af55-ae80dcf7250d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:45 crc 
kubenswrapper[4755]: I1006 08:24:45.290198 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/776e4025-0157-4c0d-af55-ae80dcf7250d-apiservice-cert\") pod \"packageserver-d55dfcdfc-njfmq\" (UID: \"776e4025-0157-4c0d-af55-ae80dcf7250d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.290219 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/776e4025-0157-4c0d-af55-ae80dcf7250d-webhook-cert\") pod \"packageserver-d55dfcdfc-njfmq\" (UID: \"776e4025-0157-4c0d-af55-ae80dcf7250d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.290240 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq55b\" (UniqueName: \"kubernetes.io/projected/960d9d23-73b6-49b2-8772-eca49d507f2f-kube-api-access-vq55b\") pod \"marketplace-operator-79b997595-zqsmk\" (UID: \"960d9d23-73b6-49b2-8772-eca49d507f2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.290973 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/833f574b-27b1-4b3c-a2d9-2e22d1434926-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-r52j4\" (UID: \"833f574b-27b1-4b3c-a2d9-2e22d1434926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291006 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/e14368cf-4d62-407f-b4b4-2318df6a6382-stats-auth\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291057 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gvsjx\" (UID: \"8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291083 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a7ab2b9-1233-4e5b-b0da-5d43ccc24665-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7vbhs\" (UID: \"8a7ab2b9-1233-4e5b-b0da-5d43ccc24665\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291105 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f9fb648b-fea0-448d-a4d3-d967953806c9-profile-collector-cert\") pod \"catalog-operator-68c6474976-mjp6w\" (UID: \"f9fb648b-fea0-448d-a4d3-d967953806c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291133 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/38c940d4-7aae-4661-9c98-aaab303881e5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6fx4d\" (UID: 
\"38c940d4-7aae-4661-9c98-aaab303881e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291157 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba7bf7ad-087d-4557-8647-02a56285e4c4-config\") pod \"service-ca-operator-777779d784-xp227\" (UID: \"ba7bf7ad-087d-4557-8647-02a56285e4c4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291196 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-socket-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291219 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-config\") pod \"route-controller-manager-6576b87f9c-6nnfs\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291242 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf6s2\" (UniqueName: \"kubernetes.io/projected/343af78f-ce0c-4feb-a8d9-38c5a524b342-kube-api-access-jf6s2\") pod \"control-plane-machine-set-operator-78cbb6b69f-qxwpv\" (UID: \"343af78f-ce0c-4feb-a8d9-38c5a524b342\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291299 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-client-ca\") pod \"route-controller-manager-6576b87f9c-6nnfs\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291366 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-plugins-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291394 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-csi-data-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291415 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f9fb648b-fea0-448d-a4d3-d967953806c9-srv-cert\") pod \"catalog-operator-68c6474976-mjp6w\" (UID: \"f9fb648b-fea0-448d-a4d3-d967953806c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291869 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqjkp\" (UniqueName: \"kubernetes.io/projected/776e4025-0157-4c0d-af55-ae80dcf7250d-kube-api-access-zqjkp\") pod \"packageserver-d55dfcdfc-njfmq\" (UID: \"776e4025-0157-4c0d-af55-ae80dcf7250d\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.291895 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1aee32aa-36a7-4bf9-80ed-4afdc433746a-etcd-ca\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.292336 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e14368cf-4d62-407f-b4b4-2318df6a6382-service-ca-bundle\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.292424 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/79747f17-84d8-434a-afdb-c737c276ae90-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dnfbc\" (UID: \"79747f17-84d8-434a-afdb-c737c276ae90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.292446 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg2gl\" (UniqueName: \"kubernetes.io/projected/264bea46-510c-4a6c-ba59-91b0388882de-kube-api-access-cg2gl\") pod \"downloads-7954f5f757-klxzw\" (UID: \"264bea46-510c-4a6c-ba59-91b0388882de\") " pod="openshift-console/downloads-7954f5f757-klxzw" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.292470 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b2b5b6ae-2351-4e36-a2d0-639c4b777a1e-config\") pod \"kube-apiserver-operator-766d6c64bb-b468l\" (UID: \"b2b5b6ae-2351-4e36-a2d0-639c4b777a1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.292491 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg6jc\" (UniqueName: \"kubernetes.io/projected/ce95f03d-ac27-4976-883b-deedb1c4b5ac-kube-api-access-zg6jc\") pod \"service-ca-9c57cc56f-kfxvn\" (UID: \"ce95f03d-ac27-4976-883b-deedb1c4b5ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.292530 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sdj2\" (UniqueName: \"kubernetes.io/projected/1e65680a-1fd6-41e3-a51a-c5bc7654216f-kube-api-access-2sdj2\") pod \"multus-admission-controller-857f4d67dd-fdr74\" (UID: \"1e65680a-1fd6-41e3-a51a-c5bc7654216f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fdr74" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.311654 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hpgbv"] Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.325071 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5mh6f" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.379418 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:45 crc kubenswrapper[4755]: W1006 08:24:45.383051 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeee21fd7_fd7b_4924_ac33_4e086deb424c.slice/crio-161cb8ee2f019839b337d0a7ff2291aaf13de172d34ebfce37ceb01d434cc8fd WatchSource:0}: Error finding container 161cb8ee2f019839b337d0a7ff2291aaf13de172d34ebfce37ceb01d434cc8fd: Status 404 returned error can't find the container with id 161cb8ee2f019839b337d0a7ff2291aaf13de172d34ebfce37ceb01d434cc8fd Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.385061 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393064 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393243 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393284 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/38c940d4-7aae-4661-9c98-aaab303881e5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6fx4d\" (UID: \"38c940d4-7aae-4661-9c98-aaab303881e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393309 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba7bf7ad-087d-4557-8647-02a56285e4c4-config\") pod \"service-ca-operator-777779d784-xp227\" (UID: \"ba7bf7ad-087d-4557-8647-02a56285e4c4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393329 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-socket-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393349 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-config\") pod \"route-controller-manager-6576b87f9c-6nnfs\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393371 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf6s2\" (UniqueName: 
\"kubernetes.io/projected/343af78f-ce0c-4feb-a8d9-38c5a524b342-kube-api-access-jf6s2\") pod \"control-plane-machine-set-operator-78cbb6b69f-qxwpv\" (UID: \"343af78f-ce0c-4feb-a8d9-38c5a524b342\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393387 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-client-ca\") pod \"route-controller-manager-6576b87f9c-6nnfs\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393403 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-plugins-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393421 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-csi-data-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393439 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f9fb648b-fea0-448d-a4d3-d967953806c9-srv-cert\") pod \"catalog-operator-68c6474976-mjp6w\" (UID: \"f9fb648b-fea0-448d-a4d3-d967953806c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393478 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zqjkp\" (UniqueName: \"kubernetes.io/projected/776e4025-0157-4c0d-af55-ae80dcf7250d-kube-api-access-zqjkp\") pod \"packageserver-d55dfcdfc-njfmq\" (UID: \"776e4025-0157-4c0d-af55-ae80dcf7250d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393494 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1aee32aa-36a7-4bf9-80ed-4afdc433746a-etcd-ca\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393510 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e14368cf-4d62-407f-b4b4-2318df6a6382-service-ca-bundle\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393528 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/79747f17-84d8-434a-afdb-c737c276ae90-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dnfbc\" (UID: \"79747f17-84d8-434a-afdb-c737c276ae90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393548 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg2gl\" (UniqueName: \"kubernetes.io/projected/264bea46-510c-4a6c-ba59-91b0388882de-kube-api-access-cg2gl\") pod \"downloads-7954f5f757-klxzw\" (UID: \"264bea46-510c-4a6c-ba59-91b0388882de\") " pod="openshift-console/downloads-7954f5f757-klxzw" Oct 
06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393666 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2b5b6ae-2351-4e36-a2d0-639c4b777a1e-config\") pod \"kube-apiserver-operator-766d6c64bb-b468l\" (UID: \"b2b5b6ae-2351-4e36-a2d0-639c4b777a1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393689 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg6jc\" (UniqueName: \"kubernetes.io/projected/ce95f03d-ac27-4976-883b-deedb1c4b5ac-kube-api-access-zg6jc\") pod \"service-ca-9c57cc56f-kfxvn\" (UID: \"ce95f03d-ac27-4976-883b-deedb1c4b5ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393713 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sdj2\" (UniqueName: \"kubernetes.io/projected/1e65680a-1fd6-41e3-a51a-c5bc7654216f-kube-api-access-2sdj2\") pod \"multus-admission-controller-857f4d67dd-fdr74\" (UID: \"1e65680a-1fd6-41e3-a51a-c5bc7654216f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fdr74" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393736 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qbd5\" (UniqueName: \"kubernetes.io/projected/c15c418e-734c-43df-b3e2-20619f626df3-kube-api-access-4qbd5\") pod \"collect-profiles-29328975-gtck4\" (UID: \"c15c418e-734c-43df-b3e2-20619f626df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393755 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/960d9d23-73b6-49b2-8772-eca49d507f2f-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-zqsmk\" (UID: \"960d9d23-73b6-49b2-8772-eca49d507f2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393772 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-mountpoint-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393790 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a7ab2b9-1233-4e5b-b0da-5d43ccc24665-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7vbhs\" (UID: \"8a7ab2b9-1233-4e5b-b0da-5d43ccc24665\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393812 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15c418e-734c-43df-b3e2-20619f626df3-secret-volume\") pod \"collect-profiles-29328975-gtck4\" (UID: \"c15c418e-734c-43df-b3e2-20619f626df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393842 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2b5b6ae-2351-4e36-a2d0-639c4b777a1e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-b468l\" (UID: \"b2b5b6ae-2351-4e36-a2d0-639c4b777a1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393859 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a-serving-cert\") pod \"openshift-config-operator-7777fb866f-gvsjx\" (UID: \"8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393887 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29a62c4d-4d44-4b7f-b115-15469a82976e-config\") pod \"kube-controller-manager-operator-78b949d7b-2c9z7\" (UID: \"29a62c4d-4d44-4b7f-b115-15469a82976e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393908 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkqtq\" (UniqueName: \"kubernetes.io/projected/f9fb648b-fea0-448d-a4d3-d967953806c9-kube-api-access-jkqtq\") pod \"catalog-operator-68c6474976-mjp6w\" (UID: \"f9fb648b-fea0-448d-a4d3-d967953806c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393928 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48jmz\" (UniqueName: \"kubernetes.io/projected/38c940d4-7aae-4661-9c98-aaab303881e5-kube-api-access-48jmz\") pod \"machine-config-controller-84d6567774-6fx4d\" (UID: \"38c940d4-7aae-4661-9c98-aaab303881e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393946 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zbsb\" (UniqueName: \"kubernetes.io/projected/ba7bf7ad-087d-4557-8647-02a56285e4c4-kube-api-access-7zbsb\") pod \"service-ca-operator-777779d784-xp227\" (UID: \"ba7bf7ad-087d-4557-8647-02a56285e4c4\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393964 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wm5h\" (UniqueName: \"kubernetes.io/projected/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-kube-api-access-6wm5h\") pod \"route-controller-manager-6576b87f9c-6nnfs\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.393983 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a7ab2b9-1233-4e5b-b0da-5d43ccc24665-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7vbhs\" (UID: \"8a7ab2b9-1233-4e5b-b0da-5d43ccc24665\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce95f03d-ac27-4976-883b-deedb1c4b5ac-signing-key\") pod \"service-ca-9c57cc56f-kfxvn\" (UID: \"ce95f03d-ac27-4976-883b-deedb1c4b5ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394021 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2e218dd8-5ee2-4355-8304-be35e207d366-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c4fl9\" (UID: \"2e218dd8-5ee2-4355-8304-be35e207d366\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394039 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/1e65680a-1fd6-41e3-a51a-c5bc7654216f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fdr74\" (UID: \"1e65680a-1fd6-41e3-a51a-c5bc7654216f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fdr74" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394665 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkkr7\" (UniqueName: \"kubernetes.io/projected/1aee32aa-36a7-4bf9-80ed-4afdc433746a-kube-api-access-wkkr7\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394694 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvjmh\" (UniqueName: \"kubernetes.io/projected/d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0-kube-api-access-qvjmh\") pod \"ingress-canary-wllq2\" (UID: \"d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0\") " pod="openshift-ingress-canary/ingress-canary-wllq2" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394711 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/62fd34fc-beed-483e-bd3c-3eeed8239d05-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lzq7b\" (UID: \"62fd34fc-beed-483e-bd3c-3eeed8239d05\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394728 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e14368cf-4d62-407f-b4b4-2318df6a6382-metrics-certs\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 
08:24:45.394770 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzvzg\" (UniqueName: \"kubernetes.io/projected/d67128b0-39d6-49ca-a8f0-337fdb64ef39-kube-api-access-jzvzg\") pod \"migrator-59844c95c7-jsvj5\" (UID: \"d67128b0-39d6-49ca-a8f0-337fdb64ef39\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jsvj5" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394787 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aee32aa-36a7-4bf9-80ed-4afdc433746a-serving-cert\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394804 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxqdd\" (UniqueName: \"kubernetes.io/projected/2e218dd8-5ee2-4355-8304-be35e207d366-kube-api-access-dxqdd\") pod \"olm-operator-6b444d44fb-c4fl9\" (UID: \"2e218dd8-5ee2-4355-8304-be35e207d366\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394831 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce95f03d-ac27-4976-883b-deedb1c4b5ac-signing-cabundle\") pod \"service-ca-9c57cc56f-kfxvn\" (UID: \"ce95f03d-ac27-4976-883b-deedb1c4b5ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394848 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c08ae61-3d4a-4905-94aa-88e03148b073-metrics-tls\") pod \"ingress-operator-5b745b69d9-dzfqk\" (UID: \"7c08ae61-3d4a-4905-94aa-88e03148b073\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394880 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm89v\" (UniqueName: \"kubernetes.io/projected/833f574b-27b1-4b3c-a2d9-2e22d1434926-kube-api-access-wm89v\") pod \"kube-storage-version-migrator-operator-b67b599dd-r52j4\" (UID: \"833f574b-27b1-4b3c-a2d9-2e22d1434926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394899 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba7bf7ad-087d-4557-8647-02a56285e4c4-serving-cert\") pod \"service-ca-operator-777779d784-xp227\" (UID: \"ba7bf7ad-087d-4557-8647-02a56285e4c4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394915 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1aee32aa-36a7-4bf9-80ed-4afdc433746a-etcd-service-ca\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394949 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hxhc\" (UniqueName: \"kubernetes.io/projected/8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a-kube-api-access-2hxhc\") pod \"openshift-config-operator-7777fb866f-gvsjx\" (UID: \"8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394956 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8a7ab2b9-1233-4e5b-b0da-5d43ccc24665-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7vbhs\" (UID: \"8a7ab2b9-1233-4e5b-b0da-5d43ccc24665\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.394965 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15c418e-734c-43df-b3e2-20619f626df3-config-volume\") pod \"collect-profiles-29328975-gtck4\" (UID: \"c15c418e-734c-43df-b3e2-20619f626df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.395070 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e14368cf-4d62-407f-b4b4-2318df6a6382-default-certificate\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.395129 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-264j8\" (UniqueName: \"kubernetes.io/projected/7c08ae61-3d4a-4905-94aa-88e03148b073-kube-api-access-264j8\") pod \"ingress-operator-5b745b69d9-dzfqk\" (UID: \"7c08ae61-3d4a-4905-94aa-88e03148b073\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.395161 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-serving-cert\") pod \"route-controller-manager-6576b87f9c-6nnfs\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:45 crc 
kubenswrapper[4755]: I1006 08:24:45.395180 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/960d9d23-73b6-49b2-8772-eca49d507f2f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zqsmk\" (UID: \"960d9d23-73b6-49b2-8772-eca49d507f2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.397006 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1aee32aa-36a7-4bf9-80ed-4afdc433746a-etcd-ca\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: E1006 08:24:45.397097 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:45.897076368 +0000 UTC m=+142.726391582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.397924 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25rss\" (UniqueName: \"kubernetes.io/projected/62fd34fc-beed-483e-bd3c-3eeed8239d05-kube-api-access-25rss\") pod \"package-server-manager-789f6589d5-lzq7b\" (UID: \"62fd34fc-beed-483e-bd3c-3eeed8239d05\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.397959 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c08ae61-3d4a-4905-94aa-88e03148b073-trusted-ca\") pod \"ingress-operator-5b745b69d9-dzfqk\" (UID: \"7c08ae61-3d4a-4905-94aa-88e03148b073\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.397985 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-client-ca\") pod \"route-controller-manager-6576b87f9c-6nnfs\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.398762 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e14368cf-4d62-407f-b4b4-2318df6a6382-service-ca-bundle\") pod 
\"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.399051 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-socket-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.399903 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29a62c4d-4d44-4b7f-b115-15469a82976e-config\") pod \"kube-controller-manager-operator-78b949d7b-2c9z7\" (UID: \"29a62c4d-4d44-4b7f-b115-15469a82976e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.400034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/38c940d4-7aae-4661-9c98-aaab303881e5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6fx4d\" (UID: \"38c940d4-7aae-4661-9c98-aaab303881e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401052 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba7bf7ad-087d-4557-8647-02a56285e4c4-config\") pod \"service-ca-operator-777779d784-xp227\" (UID: \"ba7bf7ad-087d-4557-8647-02a56285e4c4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.397992 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/343af78f-ce0c-4feb-a8d9-38c5a524b342-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qxwpv\" (UID: \"343af78f-ce0c-4feb-a8d9-38c5a524b342\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401445 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/833f574b-27b1-4b3c-a2d9-2e22d1434926-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-r52j4\" (UID: \"833f574b-27b1-4b3c-a2d9-2e22d1434926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401480 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1aee32aa-36a7-4bf9-80ed-4afdc433746a-etcd-client\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38c940d4-7aae-4661-9c98-aaab303881e5-proxy-tls\") pod \"machine-config-controller-84d6567774-6fx4d\" (UID: \"38c940d4-7aae-4661-9c98-aaab303881e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401614 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-registration-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 
08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401642 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x775t\" (UniqueName: \"kubernetes.io/projected/e14368cf-4d62-407f-b4b4-2318df6a6382-kube-api-access-x775t\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401666 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2e218dd8-5ee2-4355-8304-be35e207d366-srv-cert\") pod \"olm-operator-6b444d44fb-c4fl9\" (UID: \"2e218dd8-5ee2-4355-8304-be35e207d366\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401717 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29a62c4d-4d44-4b7f-b115-15469a82976e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2c9z7\" (UID: \"29a62c4d-4d44-4b7f-b115-15469a82976e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401760 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0-cert\") pod \"ingress-canary-wllq2\" (UID: \"d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0\") " pod="openshift-ingress-canary/ingress-canary-wllq2" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401786 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k49wf\" (UniqueName: \"kubernetes.io/projected/79747f17-84d8-434a-afdb-c737c276ae90-kube-api-access-k49wf\") pod \"cluster-samples-operator-665b6dd947-dnfbc\" (UID: 
\"79747f17-84d8-434a-afdb-c737c276ae90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401809 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c08ae61-3d4a-4905-94aa-88e03148b073-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dzfqk\" (UID: \"7c08ae61-3d4a-4905-94aa-88e03148b073\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401840 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk4dc\" (UniqueName: \"kubernetes.io/projected/e47b738a-2656-4f75-8ce7-da45f4e17424-kube-api-access-xk4dc\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401862 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2b5b6ae-2351-4e36-a2d0-639c4b777a1e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-b468l\" (UID: \"b2b5b6ae-2351-4e36-a2d0-639c4b777a1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401904 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aee32aa-36a7-4bf9-80ed-4afdc433746a-config\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401926 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/29a62c4d-4d44-4b7f-b115-15469a82976e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2c9z7\" (UID: \"29a62c4d-4d44-4b7f-b115-15469a82976e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401949 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/776e4025-0157-4c0d-af55-ae80dcf7250d-tmpfs\") pod \"packageserver-d55dfcdfc-njfmq\" (UID: \"776e4025-0157-4c0d-af55-ae80dcf7250d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401976 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/776e4025-0157-4c0d-af55-ae80dcf7250d-apiservice-cert\") pod \"packageserver-d55dfcdfc-njfmq\" (UID: \"776e4025-0157-4c0d-af55-ae80dcf7250d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.401997 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/776e4025-0157-4c0d-af55-ae80dcf7250d-webhook-cert\") pod \"packageserver-d55dfcdfc-njfmq\" (UID: \"776e4025-0157-4c0d-af55-ae80dcf7250d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.402225 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq55b\" (UniqueName: \"kubernetes.io/projected/960d9d23-73b6-49b2-8772-eca49d507f2f-kube-api-access-vq55b\") pod \"marketplace-operator-79b997595-zqsmk\" (UID: \"960d9d23-73b6-49b2-8772-eca49d507f2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 
08:24:45.402291 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/833f574b-27b1-4b3c-a2d9-2e22d1434926-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-r52j4\" (UID: \"833f574b-27b1-4b3c-a2d9-2e22d1434926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.402308 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e14368cf-4d62-407f-b4b4-2318df6a6382-stats-auth\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.402333 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gvsjx\" (UID: \"8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.402352 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a7ab2b9-1233-4e5b-b0da-5d43ccc24665-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7vbhs\" (UID: \"8a7ab2b9-1233-4e5b-b0da-5d43ccc24665\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.402373 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f9fb648b-fea0-448d-a4d3-d967953806c9-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-mjp6w\" (UID: \"f9fb648b-fea0-448d-a4d3-d967953806c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.403878 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/833f574b-27b1-4b3c-a2d9-2e22d1434926-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-r52j4\" (UID: \"833f574b-27b1-4b3c-a2d9-2e22d1434926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.404601 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2b5b6ae-2351-4e36-a2d0-639c4b777a1e-config\") pod \"kube-apiserver-operator-766d6c64bb-b468l\" (UID: \"b2b5b6ae-2351-4e36-a2d0-639c4b777a1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.408107 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/343af78f-ce0c-4feb-a8d9-38c5a524b342-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qxwpv\" (UID: \"343af78f-ce0c-4feb-a8d9-38c5a524b342\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.408411 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce95f03d-ac27-4976-883b-deedb1c4b5ac-signing-key\") pod \"service-ca-9c57cc56f-kfxvn\" (UID: \"ce95f03d-ac27-4976-883b-deedb1c4b5ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.410551 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38c940d4-7aae-4661-9c98-aaab303881e5-proxy-tls\") pod \"machine-config-controller-84d6567774-6fx4d\" (UID: \"38c940d4-7aae-4661-9c98-aaab303881e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.410692 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1aee32aa-36a7-4bf9-80ed-4afdc433746a-etcd-client\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.410698 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-registration-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.411173 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba7bf7ad-087d-4557-8647-02a56285e4c4-serving-cert\") pod \"service-ca-operator-777779d784-xp227\" (UID: \"ba7bf7ad-087d-4557-8647-02a56285e4c4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.416514 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1aee32aa-36a7-4bf9-80ed-4afdc433746a-etcd-service-ca\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.397457 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15c418e-734c-43df-b3e2-20619f626df3-config-volume\") pod \"collect-profiles-29328975-gtck4\" (UID: \"c15c418e-734c-43df-b3e2-20619f626df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.397370 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-plugins-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.397448 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-csi-data-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.412691 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/960d9d23-73b6-49b2-8772-eca49d507f2f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zqsmk\" (UID: \"960d9d23-73b6-49b2-8772-eca49d507f2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.412710 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-config\") pod \"route-controller-manager-6576b87f9c-6nnfs\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 
08:24:45.413249 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce95f03d-ac27-4976-883b-deedb1c4b5ac-signing-cabundle\") pod \"service-ca-9c57cc56f-kfxvn\" (UID: \"ce95f03d-ac27-4976-883b-deedb1c4b5ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.414125 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/776e4025-0157-4c0d-af55-ae80dcf7250d-tmpfs\") pod \"packageserver-d55dfcdfc-njfmq\" (UID: \"776e4025-0157-4c0d-af55-ae80dcf7250d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.414165 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c08ae61-3d4a-4905-94aa-88e03148b073-trusted-ca\") pod \"ingress-operator-5b745b69d9-dzfqk\" (UID: \"7c08ae61-3d4a-4905-94aa-88e03148b073\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.411878 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e47b738a-2656-4f75-8ce7-da45f4e17424-mountpoint-dir\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.415678 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/62fd34fc-beed-483e-bd3c-3eeed8239d05-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lzq7b\" (UID: \"62fd34fc-beed-483e-bd3c-3eeed8239d05\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" Oct 06 
08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.414957 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/960d9d23-73b6-49b2-8772-eca49d507f2f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zqsmk\" (UID: \"960d9d23-73b6-49b2-8772-eca49d507f2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.416865 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aee32aa-36a7-4bf9-80ed-4afdc433746a-config\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.417738 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2e218dd8-5ee2-4355-8304-be35e207d366-srv-cert\") pod \"olm-operator-6b444d44fb-c4fl9\" (UID: \"2e218dd8-5ee2-4355-8304-be35e207d366\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.416063 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gvsjx\" (UID: \"8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.418866 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0-cert\") pod \"ingress-canary-wllq2\" (UID: \"d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0\") " 
pod="openshift-ingress-canary/ingress-canary-wllq2" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.419309 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.419730 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c08ae61-3d4a-4905-94aa-88e03148b073-metrics-tls\") pod \"ingress-operator-5b745b69d9-dzfqk\" (UID: \"7c08ae61-3d4a-4905-94aa-88e03148b073\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.419983 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f9fb648b-fea0-448d-a4d3-d967953806c9-srv-cert\") pod \"catalog-operator-68c6474976-mjp6w\" (UID: \"f9fb648b-fea0-448d-a4d3-d967953806c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.420610 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15c418e-734c-43df-b3e2-20619f626df3-secret-volume\") pod \"collect-profiles-29328975-gtck4\" (UID: \"c15c418e-734c-43df-b3e2-20619f626df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.420854 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/79747f17-84d8-434a-afdb-c737c276ae90-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dnfbc\" (UID: \"79747f17-84d8-434a-afdb-c737c276ae90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.421373 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2b5b6ae-2351-4e36-a2d0-639c4b777a1e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-b468l\" (UID: \"b2b5b6ae-2351-4e36-a2d0-639c4b777a1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.424370 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e14368cf-4d62-407f-b4b4-2318df6a6382-default-certificate\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.425328 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-serving-cert\") pod \"route-controller-manager-6576b87f9c-6nnfs\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.425775 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1e65680a-1fd6-41e3-a51a-c5bc7654216f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fdr74\" (UID: \"1e65680a-1fd6-41e3-a51a-c5bc7654216f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fdr74" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.428675 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/776e4025-0157-4c0d-af55-ae80dcf7250d-apiservice-cert\") pod \"packageserver-d55dfcdfc-njfmq\" (UID: \"776e4025-0157-4c0d-af55-ae80dcf7250d\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.434544 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a-serving-cert\") pod \"openshift-config-operator-7777fb866f-gvsjx\" (UID: \"8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.437451 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e14368cf-4d62-407f-b4b4-2318df6a6382-metrics-certs\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.439111 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29a62c4d-4d44-4b7f-b115-15469a82976e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2c9z7\" (UID: \"29a62c4d-4d44-4b7f-b115-15469a82976e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.439250 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e14368cf-4d62-407f-b4b4-2318df6a6382-stats-auth\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.442598 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2e218dd8-5ee2-4355-8304-be35e207d366-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-c4fl9\" (UID: \"2e218dd8-5ee2-4355-8304-be35e207d366\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.444527 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/776e4025-0157-4c0d-af55-ae80dcf7250d-webhook-cert\") pod \"packageserver-d55dfcdfc-njfmq\" (UID: \"776e4025-0157-4c0d-af55-ae80dcf7250d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.444824 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aee32aa-36a7-4bf9-80ed-4afdc433746a-serving-cert\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.445426 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/833f574b-27b1-4b3c-a2d9-2e22d1434926-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-r52j4\" (UID: \"833f574b-27b1-4b3c-a2d9-2e22d1434926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.446530 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a7ab2b9-1233-4e5b-b0da-5d43ccc24665-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7vbhs\" (UID: \"8a7ab2b9-1233-4e5b-b0da-5d43ccc24665\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.457707 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-f9d7485db-nrx4l"] Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.458141 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f9fb648b-fea0-448d-a4d3-d967953806c9-profile-collector-cert\") pod \"catalog-operator-68c6474976-mjp6w\" (UID: \"f9fb648b-fea0-448d-a4d3-d967953806c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.458223 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvjmh\" (UniqueName: \"kubernetes.io/projected/d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0-kube-api-access-qvjmh\") pod \"ingress-canary-wllq2\" (UID: \"d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0\") " pod="openshift-ingress-canary/ingress-canary-wllq2" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.469934 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2b5b6ae-2351-4e36-a2d0-639c4b777a1e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-b468l\" (UID: \"b2b5b6ae-2351-4e36-a2d0-639c4b777a1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.472844 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.478254 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a7ab2b9-1233-4e5b-b0da-5d43ccc24665-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7vbhs\" (UID: \"8a7ab2b9-1233-4e5b-b0da-5d43ccc24665\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.480342 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.503357 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkqtq\" (UniqueName: \"kubernetes.io/projected/f9fb648b-fea0-448d-a4d3-d967953806c9-kube-api-access-jkqtq\") pod \"catalog-operator-68c6474976-mjp6w\" (UID: \"f9fb648b-fea0-448d-a4d3-d967953806c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.505900 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:45 crc kubenswrapper[4755]: E1006 08:24:45.506370 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 08:24:46.006345855 +0000 UTC m=+142.835661059 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.520872 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48jmz\" (UniqueName: \"kubernetes.io/projected/38c940d4-7aae-4661-9c98-aaab303881e5-kube-api-access-48jmz\") pod \"machine-config-controller-84d6567774-6fx4d\" (UID: \"38c940d4-7aae-4661-9c98-aaab303881e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.521966 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c"] Oct 06 08:24:45 crc kubenswrapper[4755]: W1006 08:24:45.534637 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5ef001b_4224_45ce_bdca_5865c9092f0e.slice/crio-bb2cafe3c78d783a29f3bc3a41636a09a9f221213ff57c805e3743fe3304f0a0 WatchSource:0}: Error finding container bb2cafe3c78d783a29f3bc3a41636a09a9f221213ff57c805e3743fe3304f0a0: Status 404 returned error can't find the container with id bb2cafe3c78d783a29f3bc3a41636a09a9f221213ff57c805e3743fe3304f0a0 Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.551634 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zbsb\" (UniqueName: 
\"kubernetes.io/projected/ba7bf7ad-087d-4557-8647-02a56285e4c4-kube-api-access-7zbsb\") pod \"service-ca-operator-777779d784-xp227\" (UID: \"ba7bf7ad-087d-4557-8647-02a56285e4c4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.558622 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wm5h\" (UniqueName: \"kubernetes.io/projected/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-kube-api-access-6wm5h\") pod \"route-controller-manager-6576b87f9c-6nnfs\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.559930 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" Oct 06 08:24:45 crc kubenswrapper[4755]: W1006 08:24:45.568491 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2377494_c95e_4c4e_a37b_b2a7edd85fad.slice/crio-d497dc6e82b1b46c31e0fde523a2349e4f503834106f0b2cc370d35e101f724e WatchSource:0}: Error finding container d497dc6e82b1b46c31e0fde523a2349e4f503834106f0b2cc370d35e101f724e: Status 404 returned error can't find the container with id d497dc6e82b1b46c31e0fde523a2349e4f503834106f0b2cc370d35e101f724e Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.584895 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf6s2\" (UniqueName: \"kubernetes.io/projected/343af78f-ce0c-4feb-a8d9-38c5a524b342-kube-api-access-jf6s2\") pod \"control-plane-machine-set-operator-78cbb6b69f-qxwpv\" (UID: \"343af78f-ce0c-4feb-a8d9-38c5a524b342\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.599665 4755 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.604020 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-264j8\" (UniqueName: \"kubernetes.io/projected/7c08ae61-3d4a-4905-94aa-88e03148b073-kube-api-access-264j8\") pod \"ingress-operator-5b745b69d9-dzfqk\" (UID: \"7c08ae61-3d4a-4905-94aa-88e03148b073\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.612315 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:45 crc kubenswrapper[4755]: E1006 08:24:45.612971 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.112935375 +0000 UTC m=+142.942250589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.621459 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg2gl\" (UniqueName: \"kubernetes.io/projected/264bea46-510c-4a6c-ba59-91b0388882de-kube-api-access-cg2gl\") pod \"downloads-7954f5f757-klxzw\" (UID: \"264bea46-510c-4a6c-ba59-91b0388882de\") " pod="openshift-console/downloads-7954f5f757-klxzw" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.649482 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm89v\" (UniqueName: \"kubernetes.io/projected/833f574b-27b1-4b3c-a2d9-2e22d1434926-kube-api-access-wm89v\") pod \"kube-storage-version-migrator-operator-b67b599dd-r52j4\" (UID: \"833f574b-27b1-4b3c-a2d9-2e22d1434926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.649977 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.664019 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg6jc\" (UniqueName: \"kubernetes.io/projected/ce95f03d-ac27-4976-883b-deedb1c4b5ac-kube-api-access-zg6jc\") pod \"service-ca-9c57cc56f-kfxvn\" (UID: \"ce95f03d-ac27-4976-883b-deedb1c4b5ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.678607 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sdj2\" (UniqueName: \"kubernetes.io/projected/1e65680a-1fd6-41e3-a51a-c5bc7654216f-kube-api-access-2sdj2\") pod \"multus-admission-controller-857f4d67dd-fdr74\" (UID: \"1e65680a-1fd6-41e3-a51a-c5bc7654216f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fdr74" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.687390 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wllq2" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.702028 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qbd5\" (UniqueName: \"kubernetes.io/projected/c15c418e-734c-43df-b3e2-20619f626df3-kube-api-access-4qbd5\") pod \"collect-profiles-29328975-gtck4\" (UID: \"c15c418e-734c-43df-b3e2-20619f626df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.714768 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:45 crc kubenswrapper[4755]: E1006 08:24:45.715475 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.215458705 +0000 UTC m=+143.044773919 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.722844 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqjkp\" (UniqueName: \"kubernetes.io/projected/776e4025-0157-4c0d-af55-ae80dcf7250d-kube-api-access-zqjkp\") pod \"packageserver-d55dfcdfc-njfmq\" (UID: \"776e4025-0157-4c0d-af55-ae80dcf7250d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.736970 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.745903 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-klxzw" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.754171 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk4dc\" (UniqueName: \"kubernetes.io/projected/e47b738a-2656-4f75-8ce7-da45f4e17424-kube-api-access-xk4dc\") pod \"csi-hostpathplugin-hj99z\" (UID: \"e47b738a-2656-4f75-8ce7-da45f4e17424\") " pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.767918 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkkr7\" (UniqueName: \"kubernetes.io/projected/1aee32aa-36a7-4bf9-80ed-4afdc433746a-kube-api-access-wkkr7\") pod \"etcd-operator-b45778765-4vctk\" (UID: \"1aee32aa-36a7-4bf9-80ed-4afdc433746a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.783743 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" event={"ID":"4d50d581-684f-48fb-86fa-86339fe67de7","Type":"ContainerStarted","Data":"2bd1c0db242f0b1e224f7ce97cb89735483e664f939bfaf3075a3aa05b3e1c3c"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.783795 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" event={"ID":"4d50d581-684f-48fb-86fa-86339fe67de7","Type":"ContainerStarted","Data":"13f7624c3e8353ec812419ff17ae3747632b7644694c5bb7d9d5d0b687b3df1c"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.784941 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hxhc\" (UniqueName: \"kubernetes.io/projected/8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a-kube-api-access-2hxhc\") pod \"openshift-config-operator-7777fb866f-gvsjx\" (UID: \"8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.785587 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nrx4l" event={"ID":"d5ef001b-4224-45ce-bdca-5865c9092f0e","Type":"ContainerStarted","Data":"bb2cafe3c78d783a29f3bc3a41636a09a9f221213ff57c805e3743fe3304f0a0"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.790215 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.791321 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" event={"ID":"c98cbede-25b7-40d4-b1ad-18e144e46bcc","Type":"ContainerStarted","Data":"0c6a770ef2710b787fe22e0451b60021c844573cc17ef56bedaa296edfa8ee15"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.791353 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" event={"ID":"c98cbede-25b7-40d4-b1ad-18e144e46bcc","Type":"ContainerStarted","Data":"9b6f0d48858e6a81e4877e59a03930926121df05d6b8d8c67e1e623bb9cd576d"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.794374 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.795222 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.799708 4755 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4skj5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.799748 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" podUID="c98cbede-25b7-40d4-b1ad-18e144e46bcc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.801326 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.801926 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" event={"ID":"39386f6f-4d16-4a81-9432-e486d9e6ee60","Type":"ContainerStarted","Data":"cf14833dcd3ec7cf5a505541e184f3c0456e5e6fd3a4613f98855d72aff4a6dc"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.801970 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" event={"ID":"39386f6f-4d16-4a81-9432-e486d9e6ee60","Type":"ContainerStarted","Data":"51a1b8bd7cd6a53267000de33f23c3c51ef9cf247c05ef1d2a0b6740c3d89afd"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.801988 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" 
event={"ID":"39386f6f-4d16-4a81-9432-e486d9e6ee60","Type":"ContainerStarted","Data":"136a767c4bfdd2a5169284444c40d0350aff5e8117dfaabc7e63a43b916c0bb6"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.804711 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x775t\" (UniqueName: \"kubernetes.io/projected/e14368cf-4d62-407f-b4b4-2318df6a6382-kube-api-access-x775t\") pod \"router-default-5444994796-zbxjs\" (UID: \"e14368cf-4d62-407f-b4b4-2318df6a6382\") " pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.809903 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4wgdb" event={"ID":"7c5c24ec-6be2-4b4c-a321-2559254d8158","Type":"ContainerStarted","Data":"cb078074f5eaf9572f9cb54fdc894524d63cc5729b2540998828f90f71459615"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.809968 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4wgdb" event={"ID":"7c5c24ec-6be2-4b4c-a321-2559254d8158","Type":"ContainerStarted","Data":"2f52d75b57b934bbd807ce1f11b5512da608b3c57d7c709a284a10f60a42cc02"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.812296 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" event={"ID":"266418ff-0098-46b7-a0b2-e930a1dfb1d8","Type":"ContainerStarted","Data":"afa8592ecc2a80333d8f7c6a815d050e8292126e53a0c2601db1b1b34432bb98"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.812327 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" event={"ID":"266418ff-0098-46b7-a0b2-e930a1dfb1d8","Type":"ContainerStarted","Data":"d00899fe48425f5f9bd9dfe5d088099d75baacdb7e47b39356db0cf27892cb58"} Oct 06 08:24:45 crc kubenswrapper[4755]: 
I1006 08:24:45.815300 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:45 crc kubenswrapper[4755]: E1006 08:24:45.815528 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.315497604 +0000 UTC m=+143.144812818 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.815548 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5mh6f" event={"ID":"442d39b2-76cb-4123-a4bc-2dbc8ea62041","Type":"ContainerStarted","Data":"12e28443a9b98082064bb46241fec00e5a544684e70496e863e01d52015330b9"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.815728 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 
08:24:45 crc kubenswrapper[4755]: E1006 08:24:45.816339 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.316321504 +0000 UTC m=+143.145636718 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.821826 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" event={"ID":"1b128405-242c-41da-9259-9e6fa646e505","Type":"ContainerStarted","Data":"275ba495053562330eba8f791c86132ab90c8f5a74c7b63cf8ddea934ff25855"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.822604 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25rss\" (UniqueName: \"kubernetes.io/projected/62fd34fc-beed-483e-bd3c-3eeed8239d05-kube-api-access-25rss\") pod \"package-server-manager-789f6589d5-lzq7b\" (UID: \"62fd34fc-beed-483e-bd3c-3eeed8239d05\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.824792 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fdr74" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.825439 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" event={"ID":"e2377494-c95e-4c4e-a37b-b2a7edd85fad","Type":"ContainerStarted","Data":"d497dc6e82b1b46c31e0fde523a2349e4f503834106f0b2cc370d35e101f724e"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.834217 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" event={"ID":"9ab502ea-2cb2-4127-a081-d871168af9aa","Type":"ContainerStarted","Data":"c0f5b818f0b7d7dc958a31b3a8cfbf1887b493f5cc39d1a68f5a48cd9815af5d"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.834309 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" event={"ID":"9ab502ea-2cb2-4127-a081-d871168af9aa","Type":"ContainerStarted","Data":"c354af506b68999819960a54326ac5ef0cf582b9bd55530588a2b708c908b788"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.839232 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49wf\" (UniqueName: \"kubernetes.io/projected/79747f17-84d8-434a-afdb-c737c276ae90-kube-api-access-k49wf\") pod \"cluster-samples-operator-665b6dd947-dnfbc\" (UID: \"79747f17-84d8-434a-afdb-c737c276ae90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.839619 4755 generic.go:334] "Generic (PLEG): container finished" podID="f262d5f0-ec94-4668-9a49-47616dd4625f" containerID="954971f9d8bafa0afaa83c156eda18cf8979e0c96e23e3c8de7e20cdd7cd2e3e" exitCode=0 Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.839687 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-hztlt" event={"ID":"f262d5f0-ec94-4668-9a49-47616dd4625f","Type":"ContainerDied","Data":"954971f9d8bafa0afaa83c156eda18cf8979e0c96e23e3c8de7e20cdd7cd2e3e"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.839712 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hztlt" event={"ID":"f262d5f0-ec94-4668-9a49-47616dd4625f","Type":"ContainerStarted","Data":"aa0d58bd9eee573f0326a4dbb404302121e9e9cbd2fcd0a1aeccdd5e5554b85f"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.845384 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" event={"ID":"eee21fd7-fd7b-4924-ac33-4e086deb424c","Type":"ContainerStarted","Data":"161cb8ee2f019839b337d0a7ff2291aaf13de172d34ebfce37ceb01d434cc8fd"} Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.861704 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxqdd\" (UniqueName: \"kubernetes.io/projected/2e218dd8-5ee2-4355-8304-be35e207d366-kube-api-access-dxqdd\") pod \"olm-operator-6b444d44fb-c4fl9\" (UID: \"2e218dd8-5ee2-4355-8304-be35e207d366\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.904169 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.904546 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.904601 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.904977 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.907995 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.915916 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.916976 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:45 crc kubenswrapper[4755]: E1006 08:24:45.918700 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.418663479 +0000 UTC m=+143.247978693 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.922324 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:45 crc kubenswrapper[4755]: E1006 08:24:45.922786 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.422769011 +0000 UTC m=+143.252084225 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.938082 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29a62c4d-4d44-4b7f-b115-15469a82976e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2c9z7\" (UID: \"29a62c4d-4d44-4b7f-b115-15469a82976e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.938364 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c08ae61-3d4a-4905-94aa-88e03148b073-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dzfqk\" (UID: \"7c08ae61-3d4a-4905-94aa-88e03148b073\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.962589 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzvzg\" (UniqueName: \"kubernetes.io/projected/d67128b0-39d6-49ca-a8f0-337fdb64ef39-kube-api-access-jzvzg\") pod \"migrator-59844c95c7-jsvj5\" (UID: \"d67128b0-39d6-49ca-a8f0-337fdb64ef39\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jsvj5" Oct 06 08:24:45 crc kubenswrapper[4755]: I1006 08:24:45.963405 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq55b\" (UniqueName: 
\"kubernetes.io/projected/960d9d23-73b6-49b2-8772-eca49d507f2f-kube-api-access-vq55b\") pod \"marketplace-operator-79b997595-zqsmk\" (UID: \"960d9d23-73b6-49b2-8772-eca49d507f2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:45.988261 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hj99z" Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.021494 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc" Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.023024 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.023491 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.523471765 +0000 UTC m=+143.352786979 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.053922 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.062730 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.064480 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.074864 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5snnf"] Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.087374 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx"] Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.106616 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p47k9"] Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.116726 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.124063 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.124501 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.624487158 +0000 UTC m=+143.453802372 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.226617 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.726534406 +0000 UTC m=+143.555849620 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.226422 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.227029 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.227456 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.727445409 +0000 UTC m=+143.556760623 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.236925 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jsvj5" Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.257578 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cnt4g"] Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.328007 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.329902 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.829869316 +0000 UTC m=+143.659184530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.335188 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.835168368 +0000 UTC m=+143.664483582 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.342194 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.368918 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs"] Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.371144 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l"] Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.445942 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.446192 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.946161197 +0000 UTC m=+143.775476401 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.446645 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.447188 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:46.947171182 +0000 UTC m=+143.776486396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.510054 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-42p29" podStartSLOduration=123.510014639 podStartE2EDuration="2m3.510014639s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:46.507706632 +0000 UTC m=+143.337021856" watchObservedRunningTime="2025-10-06 08:24:46.510014639 +0000 UTC m=+143.339329853" Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.547697 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.547895 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-06 08:24:47.047861147 +0000 UTC m=+143.877176361 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.548454 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.551962 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:47.051922887 +0000 UTC m=+143.881238101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.649347 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.650309 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:47.150285864 +0000 UTC m=+143.979601078 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.734458 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w"] Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.756393 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.756778 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:47.256755112 +0000 UTC m=+144.086070326 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: W1006 08:24:46.766709 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9fb648b_fea0_448d_a4d3_d967953806c9.slice/crio-36965b7a7b725693dd3a61d8e0c7df3b1b141038f9e37d9bc78a053ee672c5c7 WatchSource:0}: Error finding container 36965b7a7b725693dd3a61d8e0c7df3b1b141038f9e37d9bc78a053ee672c5c7: Status 404 returned error can't find the container with id 36965b7a7b725693dd3a61d8e0c7df3b1b141038f9e37d9bc78a053ee672c5c7 Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.859257 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.860871 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:47.36084884 +0000 UTC m=+144.190164054 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.863715 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" podStartSLOduration=123.863701701 podStartE2EDuration="2m3.863701701s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:46.862329557 +0000 UTC m=+143.691644781" watchObservedRunningTime="2025-10-06 08:24:46.863701701 +0000 UTC m=+143.693016915" Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.976230 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:46 crc kubenswrapper[4755]: E1006 08:24:46.976774 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:47.476759362 +0000 UTC m=+144.306074576 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.991688 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" event={"ID":"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe","Type":"ContainerStarted","Data":"abc88590c8289fbdf049191bbdc541aec33c3fd67dd72cbaed01cbd1e639f927"} Oct 06 08:24:46 crc kubenswrapper[4755]: I1006 08:24:46.991745 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" event={"ID":"5aecf36c-e9bc-41d1-b417-d8c81c91cdbe","Type":"ContainerStarted","Data":"9c8674afa6476a38af0c5110ef61b14fb275ada4a2f89fda90d4a4587bea3054"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.020043 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nrx4l" event={"ID":"d5ef001b-4224-45ce-bdca-5865c9092f0e","Type":"ContainerStarted","Data":"b05d1d892b0cc2ea6a3610e572f4e0c0bf88d19e103f57103befa4973fe3b560"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.082102 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:47 crc kubenswrapper[4755]: E1006 08:24:47.082298 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:47.582278916 +0000 UTC m=+144.411594130 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.082849 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:47 crc kubenswrapper[4755]: E1006 08:24:47.083386 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:47.583357533 +0000 UTC m=+144.412672747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.083832 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" event={"ID":"b2b5b6ae-2351-4e36-a2d0-639c4b777a1e","Type":"ContainerStarted","Data":"74eaa1ec10de12cf220a74f6ffc0ccbd6d885e3b6e4bc592f33c2175aa5da52c"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.113502 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" event={"ID":"8a7ab2b9-1233-4e5b-b0da-5d43ccc24665","Type":"ContainerStarted","Data":"575cf86de393a2a431f2b3acda5998cf3de52b4006ad4aee4d21e62a0c6e3d6a"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.137437 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5mh6f" event={"ID":"442d39b2-76cb-4123-a4bc-2dbc8ea62041","Type":"ContainerStarted","Data":"b3a1171c0d911311e126c2dbad80dab68a9a23d11de6ded31303915eedd1afb4"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.166057 4755 generic.go:334] "Generic (PLEG): container finished" podID="1b128405-242c-41da-9259-9e6fa646e505" containerID="5ee7f37195474d2ac627a65d34fffcc07b0e899ee996d1804a24b9cc97c01ca7" exitCode=0 Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.166410 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" 
event={"ID":"1b128405-242c-41da-9259-9e6fa646e505","Type":"ContainerDied","Data":"5ee7f37195474d2ac627a65d34fffcc07b0e899ee996d1804a24b9cc97c01ca7"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.166505 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" event={"ID":"1b128405-242c-41da-9259-9e6fa646e505","Type":"ContainerStarted","Data":"366223e9a32639aad96a16361164b50e8a1715cfc391330249643a99381344e6"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.169999 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-2lqpg" podStartSLOduration=124.169963288 podStartE2EDuration="2m4.169963288s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:47.116174245 +0000 UTC m=+143.945489479" watchObservedRunningTime="2025-10-06 08:24:47.169963288 +0000 UTC m=+143.999278502" Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.185401 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:47 crc kubenswrapper[4755]: E1006 08:24:47.186857 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:47.686830776 +0000 UTC m=+144.516145990 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.190761 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cnt4g" event={"ID":"84e010ca-d47c-40d6-8b18-d67164e60d0b","Type":"ContainerStarted","Data":"f7e7098bb557d94e8bdf072c24c3dfe478f7c659babbf5cb794b1ece2d7617f7"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.202808 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" event={"ID":"92199f0a-b1db-438f-8e44-446e840f07cf","Type":"ContainerStarted","Data":"bfd8897102d298c46116f492287ce692f6d78f29b912464992c72c149d1dce46"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.205943 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5snnf" event={"ID":"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f","Type":"ContainerStarted","Data":"7a6391d6e5d34044cee4ec1d129d64da6e974b7ce642fcab9721354540832938"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.207094 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-5snnf" Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.220671 4755 patch_prober.go:28] interesting pod/console-operator-58897d9998-5snnf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 06 08:24:47 crc 
kubenswrapper[4755]: I1006 08:24:47.220755 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5snnf" podUID="e61e6c52-261a-4ca9-b4aa-3da462aa4e7f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.222958 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4wgdb" event={"ID":"7c5c24ec-6be2-4b4c-a321-2559254d8158","Type":"ContainerStarted","Data":"5c8c9ae2c4a74643ff23a44ccd2598c3b447e2b980a471d40ec15d29e1f6c561"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.239582 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" event={"ID":"4d50d581-684f-48fb-86fa-86339fe67de7","Type":"ContainerStarted","Data":"46a79465bb9d1cb554bf628d6394e55db7369664a3f1ed3e01ef3ab3e80bf1ac"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.269751 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zbxjs" event={"ID":"e14368cf-4d62-407f-b4b4-2318df6a6382","Type":"ContainerStarted","Data":"ba6f930944256bb35cab6c17be21febb8189466107f00da423d6c04b9bc7fba3"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.289755 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" event={"ID":"f9fb648b-fea0-448d-a4d3-d967953806c9","Type":"ContainerStarted","Data":"36965b7a7b725693dd3a61d8e0c7df3b1b141038f9e37d9bc78a053ee672c5c7"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.290686 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:47 crc kubenswrapper[4755]: E1006 08:24:47.292220 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:47.792205337 +0000 UTC m=+144.621520551 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.298431 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" event={"ID":"e2377494-c95e-4c4e-a37b-b2a7edd85fad","Type":"ContainerStarted","Data":"4148e963aed8eeeec2f2f0fe8e0e68aee364332cf35076196d1e1fa9c1df11a6"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.298506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" event={"ID":"e2377494-c95e-4c4e-a37b-b2a7edd85fad","Type":"ContainerStarted","Data":"2c4368ffe08217931ee54a6094c2b80982d2c8d3c25ff9a21694a0ca6754e64e"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.327483 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" 
event={"ID":"eee21fd7-fd7b-4924-ac33-4e086deb424c","Type":"ContainerStarted","Data":"6b655f0e6f4c04e2baff485031a3165c8418039cbba21e5b182bd44dd2d9db1d"} Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.354118 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.393451 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:47 crc kubenswrapper[4755]: E1006 08:24:47.394934 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:47.894874841 +0000 UTC m=+144.724190055 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.464248 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bj8wv" podStartSLOduration=124.464223528 podStartE2EDuration="2m4.464223528s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:47.46225814 +0000 UTC m=+144.291573354" watchObservedRunningTime="2025-10-06 08:24:47.464223528 +0000 UTC m=+144.293538742" Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.496197 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:47 crc kubenswrapper[4755]: E1006 08:24:47.499332 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:47.999313207 +0000 UTC m=+144.828628421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.598112 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:47 crc kubenswrapper[4755]: E1006 08:24:47.598805 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:48.098771691 +0000 UTC m=+144.928086905 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.599105 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:47 crc kubenswrapper[4755]: E1006 08:24:47.599718 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:48.099700655 +0000 UTC m=+144.929015869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.701592 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:47 crc kubenswrapper[4755]: E1006 08:24:47.702077 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:48.20206003 +0000 UTC m=+145.031375244 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.803796 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:47 crc kubenswrapper[4755]: E1006 08:24:47.818074 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:48.318052434 +0000 UTC m=+145.147367648 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.803951 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" podStartSLOduration=124.803927984 podStartE2EDuration="2m4.803927984s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:47.740053631 +0000 UTC m=+144.569368845" watchObservedRunningTime="2025-10-06 08:24:47.803927984 +0000 UTC m=+144.633243198" Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.857267 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4vctk"] Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.875850 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nrx4l" podStartSLOduration=124.875815545 podStartE2EDuration="2m4.875815545s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:47.824707979 +0000 UTC m=+144.654023193" watchObservedRunningTime="2025-10-06 08:24:47.875815545 +0000 UTC m=+144.705130759" Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.889159 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-hpgbv" podStartSLOduration=124.889138215 podStartE2EDuration="2m4.889138215s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:47.879313112 +0000 UTC m=+144.708628336" watchObservedRunningTime="2025-10-06 08:24:47.889138215 +0000 UTC m=+144.718453429" Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.914793 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:47 crc kubenswrapper[4755]: E1006 08:24:47.915192 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:48.41516696 +0000 UTC m=+145.244482174 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.920154 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.952031 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv"] Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.952938 4755 patch_prober.go:28] interesting pod/router-default-5444994796-zbxjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:24:47 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Oct 06 08:24:47 crc kubenswrapper[4755]: [+]process-running ok Oct 06 08:24:47 crc kubenswrapper[4755]: healthz check failed Oct 06 08:24:47 crc kubenswrapper[4755]: I1006 08:24:47.953016 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbxjs" podUID="e14368cf-4d62-407f-b4b4-2318df6a6382" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.018156 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:48 crc kubenswrapper[4755]: E1006 08:24:48.018671 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:48.518654594 +0000 UTC m=+145.347969808 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.030197 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tr47c" podStartSLOduration=125.030174029 podStartE2EDuration="2m5.030174029s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:47.956174035 +0000 UTC m=+144.785489249" watchObservedRunningTime="2025-10-06 08:24:48.030174029 +0000 UTC m=+144.859489243" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.031395 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wllq2"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.054372 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-5mh6f" podStartSLOduration=6.054354568 
podStartE2EDuration="6.054354568s" podCreationTimestamp="2025-10-06 08:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:48.05319424 +0000 UTC m=+144.882509454" watchObservedRunningTime="2025-10-06 08:24:48.054354568 +0000 UTC m=+144.883669782" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.098553 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.102029 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-74nwf" podStartSLOduration=125.101999979 podStartE2EDuration="2m5.101999979s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:48.090036722 +0000 UTC m=+144.919351936" watchObservedRunningTime="2025-10-06 08:24:48.101999979 +0000 UTC m=+144.931315193" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.122265 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:48 crc kubenswrapper[4755]: E1006 08:24:48.122767 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:48.622752692 +0000 UTC m=+145.452067906 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.135110 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-klxzw"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.200487 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-5snnf" podStartSLOduration=125.200461678 podStartE2EDuration="2m5.200461678s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:48.174054353 +0000 UTC m=+145.003369567" watchObservedRunningTime="2025-10-06 08:24:48.200461678 +0000 UTC m=+145.029776892" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.223724 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:48 crc kubenswrapper[4755]: E1006 08:24:48.224090 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 08:24:48.724077973 +0000 UTC m=+145.553393187 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.227241 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xp227"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.233585 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-4wgdb" podStartSLOduration=125.233543838 podStartE2EDuration="2m5.233543838s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:48.221486019 +0000 UTC m=+145.050801233" watchObservedRunningTime="2025-10-06 08:24:48.233543838 +0000 UTC m=+145.062859052" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.284372 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-zbxjs" podStartSLOduration=125.284349206 podStartE2EDuration="2m5.284349206s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:48.27442202 +0000 UTC m=+145.103737234" watchObservedRunningTime="2025-10-06 08:24:48.284349206 +0000 UTC m=+145.113664420" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.287822 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.326693 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:48 crc kubenswrapper[4755]: E1006 08:24:48.327277 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:48.827260838 +0000 UTC m=+145.656576052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.342765 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.397994 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tmlcx" podStartSLOduration=125.397972741 podStartE2EDuration="2m5.397972741s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-10-06 08:24:48.396988626 +0000 UTC m=+145.226303840" watchObservedRunningTime="2025-10-06 08:24:48.397972741 +0000 UTC m=+145.227287955" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.431388 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" event={"ID":"1aee32aa-36a7-4bf9-80ed-4afdc433746a","Type":"ContainerStarted","Data":"ade9d1603be1a647f7ab89906288b42a949dcf7f6a3fd8a3d836d779d6331bad"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.431959 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:48 crc kubenswrapper[4755]: E1006 08:24:48.432265 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:48.932251649 +0000 UTC m=+145.761566853 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.487046 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" event={"ID":"8a7ab2b9-1233-4e5b-b0da-5d43ccc24665","Type":"ContainerStarted","Data":"c7774442e4bbe86d4469a85ecd674c92327382281b0dd913599f3c0b6a69c838"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.512520 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zbxjs" event={"ID":"e14368cf-4d62-407f-b4b4-2318df6a6382","Type":"ContainerStarted","Data":"7d36890c0b8ec2f50263fdef3d8a73a846b4a6af70704be87923910177f1d636"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.514958 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv" event={"ID":"343af78f-ce0c-4feb-a8d9-38c5a524b342","Type":"ContainerStarted","Data":"0594605a1291fde9009dde20aafbdc040914c002ce434462c3c6a10831f11702"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.527511 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7vbhs" podStartSLOduration=125.527493049 podStartE2EDuration="2m5.527493049s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:48.526256919 
+0000 UTC m=+145.355572133" watchObservedRunningTime="2025-10-06 08:24:48.527493049 +0000 UTC m=+145.356808263" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.531467 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.535628 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:48 crc kubenswrapper[4755]: E1006 08:24:48.535721 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:49.035700672 +0000 UTC m=+145.865015886 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.545209 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:48 crc kubenswrapper[4755]: E1006 08:24:48.545629 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:49.045616798 +0000 UTC m=+145.874932012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.558721 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cnt4g" event={"ID":"84e010ca-d47c-40d6-8b18-d67164e60d0b","Type":"ContainerStarted","Data":"7e7d9b1e4f814b961e6e0a760882a82bb3cdbc2491c3bad58dc6dceaeba07846"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.562804 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.575541 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hj99z"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.606098 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" event={"ID":"92199f0a-b1db-438f-8e44-446e840f07cf","Type":"ContainerStarted","Data":"8173d79c148ac8d2967d968284a2c3bccb725aa64c8b54f5aabf16efdddbd892"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.607007 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.637957 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" 
event={"ID":"b2b5b6ae-2351-4e36-a2d0-639c4b777a1e","Type":"ContainerStarted","Data":"7711b16acf4e07677655d2f48c88e347566636f43b2b4526d4c2067687965070"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.641671 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" podStartSLOduration=125.641642917 podStartE2EDuration="2m5.641642917s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:48.638865399 +0000 UTC m=+145.468180613" watchObservedRunningTime="2025-10-06 08:24:48.641642917 +0000 UTC m=+145.470958131" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.646352 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:48 crc kubenswrapper[4755]: E1006 08:24:48.647824 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:49.147793479 +0000 UTC m=+145.977108693 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.657792 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" event={"ID":"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227","Type":"ContainerStarted","Data":"52c738f322f5c636d9687df7528b7e9355ad72e7265c90dde8163fc60a00f320"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.663942 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b468l" podStartSLOduration=125.663917359 podStartE2EDuration="2m5.663917359s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:48.663086718 +0000 UTC m=+145.492401932" watchObservedRunningTime="2025-10-06 08:24:48.663917359 +0000 UTC m=+145.493232573" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.707353 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wllq2" event={"ID":"d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0","Type":"ContainerStarted","Data":"dbee7938d0dfa5399c4837b2ba8d8e102fb0b8f8ec0a1e75a262be1d4d17dd5e"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.716439 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zqsmk"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.740220 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" event={"ID":"f9fb648b-fea0-448d-a4d3-d967953806c9","Type":"ContainerStarted","Data":"76a96c35a5eae8e6e38860b9597eb703be2e4be1d9889a76fe3f3de7c7c09ad9"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.741367 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.749414 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:48 crc kubenswrapper[4755]: E1006 08:24:48.754385 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:49.25436003 +0000 UTC m=+146.083675434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.761744 4755 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mjp6w container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.761804 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" podUID="f9fb648b-fea0-448d-a4d3-d967953806c9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.780547 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.798348 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fdr74"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.807112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hztlt" event={"ID":"f262d5f0-ec94-4668-9a49-47616dd4625f","Type":"ContainerStarted","Data":"1531b3c0a500813864db0c0eb0ae763a7da44f48183be91083f5eff13eb29e18"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.807432 
4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.807453 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hztlt" event={"ID":"f262d5f0-ec94-4668-9a49-47616dd4625f","Type":"ContainerStarted","Data":"367a0d346dd2af5623bb6a3a6316a4544cac9082ccb900f2d9236cbada074621"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.827707 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w" podStartSLOduration=125.827687377 podStartE2EDuration="2m5.827687377s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:48.802222936 +0000 UTC m=+145.631538170" watchObservedRunningTime="2025-10-06 08:24:48.827687377 +0000 UTC m=+145.657002591" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.828545 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.841169 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-klxzw" event={"ID":"264bea46-510c-4a6c-ba59-91b0388882de","Type":"ContainerStarted","Data":"e3a39168d879e985cba9265f9cad8d6dd282da5d6b405a19b044aecfccb9cfa4"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.841215 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jsvj5"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.867930 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:48 crc kubenswrapper[4755]: E1006 08:24:48.868676 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:49.368645581 +0000 UTC m=+146.197960795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.868947 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:48 crc kubenswrapper[4755]: E1006 08:24:48.869920 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:49.369910832 +0000 UTC m=+146.199226036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.875967 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5snnf" event={"ID":"e61e6c52-261a-4ca9-b4aa-3da462aa4e7f","Type":"ContainerStarted","Data":"8be14308ddbcc19a76c98dca651a4bbeaeeb9069d01545511bb5da250b1b39b7"} Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.919963 4755 patch_prober.go:28] interesting pod/router-default-5444994796-zbxjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:24:48 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Oct 06 08:24:48 crc kubenswrapper[4755]: [+]process-running ok Oct 06 08:24:48 crc kubenswrapper[4755]: healthz check failed Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.920041 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbxjs" podUID="e14368cf-4d62-407f-b4b4-2318df6a6382" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.920129 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:24:48 crc 
kubenswrapper[4755]: I1006 08:24:48.920146 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.965857 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-hztlt" podStartSLOduration=125.965832489 podStartE2EDuration="2m5.965832489s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:48.857384972 +0000 UTC m=+145.686700186" watchObservedRunningTime="2025-10-06 08:24:48.965832489 +0000 UTC m=+145.795147703" Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.968656 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kfxvn"] Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.969648 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:48 crc kubenswrapper[4755]: E1006 08:24:48.970396 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:49.470379422 +0000 UTC m=+146.299694636 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.971504 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:48 crc kubenswrapper[4755]: E1006 08:24:48.972196 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:49.472173296 +0000 UTC m=+146.301488510 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:48 crc kubenswrapper[4755]: I1006 08:24:48.998632 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc"] Oct 06 08:24:49 crc kubenswrapper[4755]: W1006 08:24:49.018030 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce95f03d_ac27_4976_883b_deedb1c4b5ac.slice/crio-b0bdf1e0e626e3a4fdeb64dcaac397cd51472b4475c1163476c26989e71dd389 WatchSource:0}: Error finding container b0bdf1e0e626e3a4fdeb64dcaac397cd51472b4475c1163476c26989e71dd389: Status 404 returned error can't find the container with id b0bdf1e0e626e3a4fdeb64dcaac397cd51472b4475c1163476c26989e71dd389 Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.069012 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4"] Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.074822 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:49 crc kubenswrapper[4755]: E1006 08:24:49.076331 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:49.576284095 +0000 UTC m=+146.405599309 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.100272 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk"] Oct 06 08:24:49 crc kubenswrapper[4755]: W1006 08:24:49.151313 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod833f574b_27b1_4b3c_a2d9_2e22d1434926.slice/crio-30bf8bcb25e975c58a7887d2fdfeb408f5ab3673aad8e1bd8c1f9849476b18e4 WatchSource:0}: Error finding container 30bf8bcb25e975c58a7887d2fdfeb408f5ab3673aad8e1bd8c1f9849476b18e4: Status 404 returned error can't find the container with id 30bf8bcb25e975c58a7887d2fdfeb408f5ab3673aad8e1bd8c1f9849476b18e4 Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.176448 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:49 crc kubenswrapper[4755]: E1006 08:24:49.176878 4755 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:49.676864127 +0000 UTC m=+146.506179341 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:49 crc kubenswrapper[4755]: W1006 08:24:49.184922 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c08ae61_3d4a_4905_94aa_88e03148b073.slice/crio-f7cfd5a71f71ee4a8e2ae26dbe69fe4c476eb7f32dba702783c9e7bf6d7029e3 WatchSource:0}: Error finding container f7cfd5a71f71ee4a8e2ae26dbe69fe4c476eb7f32dba702783c9e7bf6d7029e3: Status 404 returned error can't find the container with id f7cfd5a71f71ee4a8e2ae26dbe69fe4c476eb7f32dba702783c9e7bf6d7029e3 Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.278876 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:49 crc kubenswrapper[4755]: E1006 08:24:49.279603 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 08:24:49.779585332 +0000 UTC m=+146.608900536 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.383201 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:49 crc kubenswrapper[4755]: E1006 08:24:49.383627 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:49.883613559 +0000 UTC m=+146.712928773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.406850 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.484623 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:49 crc kubenswrapper[4755]: E1006 08:24:49.485384 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:49.98534622 +0000 UTC m=+146.814661434 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.592510 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:49 crc kubenswrapper[4755]: E1006 08:24:49.593524 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:50.093503589 +0000 UTC m=+146.922818813 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.637886 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.638722 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.696510 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:49 crc kubenswrapper[4755]: E1006 08:24:49.696876 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:50.196840649 +0000 UTC m=+147.026155863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.697098 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:49 crc kubenswrapper[4755]: E1006 08:24:49.697602 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:50.197583297 +0000 UTC m=+147.026898511 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.801283 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:49 crc kubenswrapper[4755]: E1006 08:24:49.801868 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:50.3018427 +0000 UTC m=+147.131157914 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.875120 4755 patch_prober.go:28] interesting pod/console-operator-58897d9998-5snnf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.875207 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5snnf" podUID="e61e6c52-261a-4ca9-b4aa-3da462aa4e7f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.903117 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:49 crc kubenswrapper[4755]: E1006 08:24:49.903617 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 08:24:50.40359493 +0000 UTC m=+147.232910144 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.915457 4755 patch_prober.go:28] interesting pod/router-default-5444994796-zbxjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:24:49 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Oct 06 08:24:49 crc kubenswrapper[4755]: [+]process-running ok Oct 06 08:24:49 crc kubenswrapper[4755]: healthz check failed Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.915522 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbxjs" podUID="e14368cf-4d62-407f-b4b4-2318df6a6382" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.920985 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.921043 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.921123 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272" Oct 06 08:24:49 
crc kubenswrapper[4755]: I1006 08:24:49.963317 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv" event={"ID":"343af78f-ce0c-4feb-a8d9-38c5a524b342","Type":"ContainerStarted","Data":"edad309d96bcc98a6ab75aef8296b054ca5cf942f07e933bcb59dac831e13e67"} Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.972423 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" event={"ID":"29a62c4d-4d44-4b7f-b115-15469a82976e","Type":"ContainerStarted","Data":"1f433ada287f5f616c43aa069b53703cbc7ac60b37dc9582fd33dbe4cce96340"} Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.990272 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" event={"ID":"833f574b-27b1-4b3c-a2d9-2e22d1434926","Type":"ContainerStarted","Data":"30bf8bcb25e975c58a7887d2fdfeb408f5ab3673aad8e1bd8c1f9849476b18e4"} Oct 06 08:24:49 crc kubenswrapper[4755]: I1006 08:24:49.994264 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fdr74" event={"ID":"1e65680a-1fd6-41e3-a51a-c5bc7654216f","Type":"ContainerStarted","Data":"475b0a242273d2cb9f986e9a337d13f94501ef7e43f646257f6cbd11a9cb3472"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.003607 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hj99z" event={"ID":"e47b738a-2656-4f75-8ce7-da45f4e17424","Type":"ContainerStarted","Data":"c9dc293efe08764226caa6189178e658176b7a5315a47907c8196684f9c4d583"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.005758 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:50 crc kubenswrapper[4755]: E1006 08:24:50.006939 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:50.50691805 +0000 UTC m=+147.336233264 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.031926 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" event={"ID":"7c08ae61-3d4a-4905-94aa-88e03148b073","Type":"ContainerStarted","Data":"f7cfd5a71f71ee4a8e2ae26dbe69fe4c476eb7f32dba702783c9e7bf6d7029e3"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.034680 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" event={"ID":"c15c418e-734c-43df-b3e2-20619f626df3","Type":"ContainerStarted","Data":"2cf4dbde3203b3f58f2c028fb4eb4fad84516b538c4525eecfe731f22e2955e3"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.055007 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jsvj5" 
event={"ID":"d67128b0-39d6-49ca-a8f0-337fdb64ef39","Type":"ContainerStarted","Data":"28aaa3e91356228c9cfdb867e7e9cb06daf6043a6ef62ef93cb70d82871cd0da"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.055070 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jsvj5" event={"ID":"d67128b0-39d6-49ca-a8f0-337fdb64ef39","Type":"ContainerStarted","Data":"c308aef527f1a07c0eb2487bd23af4b7dcedfe0a62f601d7da600b564a1dd66d"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.087371 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" event={"ID":"776e4025-0157-4c0d-af55-ae80dcf7250d","Type":"ContainerStarted","Data":"cd3d7d2d6690987b7071d835933ac28609a1fd1b140528c46bbf8ee12517951a"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.087901 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" event={"ID":"776e4025-0157-4c0d-af55-ae80dcf7250d","Type":"ContainerStarted","Data":"6a66b54c0264e02644c9ac8a13f5f1599b08974982ebc13df916364fd3a3f3a8"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.088343 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.088461 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxwpv" podStartSLOduration=127.08843748 podStartE2EDuration="2m7.08843748s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:50.087130837 +0000 UTC m=+146.916446051" watchObservedRunningTime="2025-10-06 08:24:50.08843748 +0000 UTC m=+146.917752694" Oct 06 
08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.099687 4755 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-njfmq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:5443/healthz\": dial tcp 10.217.0.24:5443: connect: connection refused" start-of-body= Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.099746 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" podUID="776e4025-0157-4c0d-af55-ae80dcf7250d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.24:5443/healthz\": dial tcp 10.217.0.24:5443: connect: connection refused" Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.099937 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" event={"ID":"ce95f03d-ac27-4976-883b-deedb1c4b5ac","Type":"ContainerStarted","Data":"81270f53f3580117307b5d46073aae36a76e69783af073fc5666272da39cdd4e"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.099972 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" event={"ID":"ce95f03d-ac27-4976-883b-deedb1c4b5ac","Type":"ContainerStarted","Data":"b0bdf1e0e626e3a4fdeb64dcaac397cd51472b4475c1163476c26989e71dd389"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.109691 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:50 crc kubenswrapper[4755]: E1006 08:24:50.110459 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:50.610433715 +0000 UTC m=+147.439748929 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.125156 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" event={"ID":"38c940d4-7aae-4661-9c98-aaab303881e5","Type":"ContainerStarted","Data":"de2e9d646fb3aa80c3f4936f47eb0f506dac90575ffde6a9fbc94894164f1052"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.125238 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" event={"ID":"38c940d4-7aae-4661-9c98-aaab303881e5","Type":"ContainerStarted","Data":"12e129b2e7313a49010bbe9d10feb6b40df502260ea849fcc4d11a4570e1c926"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.129427 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" event={"ID":"ba7bf7ad-087d-4557-8647-02a56285e4c4","Type":"ContainerStarted","Data":"d002f20ebaa36ac4cfb5adc8aac4d2d921506da90794036ce5d28321a65f0e15"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.129480 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" 
event={"ID":"ba7bf7ad-087d-4557-8647-02a56285e4c4","Type":"ContainerStarted","Data":"c2babb7f54f46757886b7ec2d9646e51c7e7cb3202451cd2536309693f64dfe1"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.147973 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc" event={"ID":"79747f17-84d8-434a-afdb-c737c276ae90","Type":"ContainerStarted","Data":"6d6592e918019fd2242b4c5d122fd73665b828d1a62f263d4c93e2a532729261"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.164406 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wllq2" event={"ID":"d4ac0ad4-4c6d-4a28-84a7-c5fb44de75d0","Type":"ContainerStarted","Data":"2433a4dac37ace65a453dffec26ca5c4b13b69b79a5426c371dcf916d4cf5890"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.181517 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kfxvn" podStartSLOduration=127.181495566 podStartE2EDuration="2m7.181495566s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:50.179171788 +0000 UTC m=+147.008487002" watchObservedRunningTime="2025-10-06 08:24:50.181495566 +0000 UTC m=+147.010810780" Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.181940 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" podStartSLOduration=127.181936066 podStartE2EDuration="2m7.181936066s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:50.148390735 +0000 UTC m=+146.977705959" watchObservedRunningTime="2025-10-06 08:24:50.181936066 +0000 UTC 
m=+147.011251280" Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.186854 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cnt4g" event={"ID":"84e010ca-d47c-40d6-8b18-d67164e60d0b","Type":"ContainerStarted","Data":"ba8513806195eb9c5c4fcb2a11f9a389b228a7968fc3f2676ad2e6642df2be27"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.189265 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.210888 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-klxzw" event={"ID":"264bea46-510c-4a6c-ba59-91b0388882de","Type":"ContainerStarted","Data":"3366932c2fdd02add2282485aa3a5396635b5b5b56a3b13a02b516568867b987"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.210963 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:50 crc kubenswrapper[4755]: E1006 08:24:50.212554 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:50.712529744 +0000 UTC m=+147.541844958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.213000 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-klxzw" Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.226703 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" event={"ID":"1aee32aa-36a7-4bf9-80ed-4afdc433746a","Type":"ContainerStarted","Data":"fd36d9792b76a932ea7c85d4a43e38f8a7010b178af732387a59832308aad1aa"} Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.237625 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-klxzw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.237695 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klxzw" podUID="264bea46-510c-4a6c-ba59-91b0388882de" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.258196 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" 
event={"ID":"62fd34fc-beed-483e-bd3c-3eeed8239d05","Type":"ContainerStarted","Data":"fc9aa1bc90a48964b5c98914b9fcc12d15954d05223e2b176d885b8f1cbe5fff"}
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.258255 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" event={"ID":"62fd34fc-beed-483e-bd3c-3eeed8239d05","Type":"ContainerStarted","Data":"f512b32019c4e2638dad1d39567702cb8f956c8980516c5b0629b5274a9e4697"}
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.262611 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wllq2" podStartSLOduration=8.25999825 podStartE2EDuration="8.25999825s" podCreationTimestamp="2025-10-06 08:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:50.25757433 +0000 UTC m=+147.086889544" watchObservedRunningTime="2025-10-06 08:24:50.25999825 +0000 UTC m=+147.089313464"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.263161 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xp227" podStartSLOduration=127.263148348 podStartE2EDuration="2m7.263148348s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:50.223036554 +0000 UTC m=+147.052351768" watchObservedRunningTime="2025-10-06 08:24:50.263148348 +0000 UTC m=+147.092463562"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.297302 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" 
event={"ID":"2e218dd8-5ee2-4355-8304-be35e207d366","Type":"ContainerStarted","Data":"6c5c9cb664bc5be4edd0ddfa664efe20e299f49f251f93e30c2446dbce3fef24"}
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.297374 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" event={"ID":"2e218dd8-5ee2-4355-8304-be35e207d366","Type":"ContainerStarted","Data":"1b6af6a5203ca5e378664b80c6ca472790372b8f6e5de89efa184e3dcbe0b7fb"}
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.305434 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.314096 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.314421 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cnt4g" podStartSLOduration=8.314399788 podStartE2EDuration="8.314399788s" podCreationTimestamp="2025-10-06 08:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:50.313936416 +0000 UTC m=+147.143251630" watchObservedRunningTime="2025-10-06 08:24:50.314399788 +0000 UTC m=+147.143715002"
Oct 06 08:24:50 crc kubenswrapper[4755]: E1006 08:24:50.315658 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 08:24:50.815643279 +0000 UTC m=+147.644958493 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.316323 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" event={"ID":"8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a","Type":"ContainerStarted","Data":"0bc94119430468fc59e74b1502e87dcd974c25a5e080780367b1936b181f370c"}
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.316385 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" event={"ID":"8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a","Type":"ContainerStarted","Data":"811f4e5af5e55222851f738bfc9991b9b65c79d06ddfb8afbb9dccc9794447a1"}
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.326368 4755 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-c4fl9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body=
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.326440 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" podUID="2e218dd8-5ee2-4355-8304-be35e207d366" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: 
connection refused"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.364337 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" event={"ID":"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227","Type":"ContainerStarted","Data":"06415cdcb3b3e07252b4c11041d1a58820b52190461e377045bc992d20f8d6ef"}
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.364899 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.409215 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-klxzw" podStartSLOduration=127.409189426 podStartE2EDuration="2m7.409189426s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:50.349271582 +0000 UTC m=+147.178586796" watchObservedRunningTime="2025-10-06 08:24:50.409189426 +0000 UTC m=+147.238504640"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.418413 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:24:50 crc kubenswrapper[4755]: E1006 08:24:50.420442 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:50.920425294 +0000 UTC m=+147.749740508 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.422347 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" event={"ID":"960d9d23-73b6-49b2-8772-eca49d507f2f","Type":"ContainerStarted","Data":"e3d0158ad413c7ece37e51fb53945de4b694147a2824fe4616b8f94a2aac7cec"}
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.457207 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" podStartSLOduration=127.457175195 podStartE2EDuration="2m7.457175195s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:50.412013986 +0000 UTC m=+147.241329200" watchObservedRunningTime="2025-10-06 08:24:50.457175195 +0000 UTC m=+147.286490409"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.457401 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.458831 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4vctk" podStartSLOduration=127.458824976 podStartE2EDuration="2m7.458824976s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-06 08:24:50.456314854 +0000 UTC m=+147.285630078" watchObservedRunningTime="2025-10-06 08:24:50.458824976 +0000 UTC m=+147.288140190"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.465972 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mjp6w"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.484794 4755 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zqsmk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.484912 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" podUID="960d9d23-73b6-49b2-8772-eca49d507f2f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.512313 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h5272"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.535014 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" podStartSLOduration=127.534990713 podStartE2EDuration="2m7.534990713s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:50.533175668 +0000 UTC m=+147.362490872" watchObservedRunningTime="2025-10-06 08:24:50.534990713 +0000 UTC m=+147.364305927"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.550751 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7"
Oct 06 08:24:50 crc kubenswrapper[4755]: E1006 08:24:50.552279 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:51.052250881 +0000 UTC m=+147.881566095 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.600539 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" podStartSLOduration=127.600516866 podStartE2EDuration="2m7.600516866s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:50.599112111 +0000 UTC m=+147.428427325" watchObservedRunningTime="2025-10-06 08:24:50.600516866 +0000 UTC m=+147.429832080"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.652617 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:24:50 crc kubenswrapper[4755]: E1006 08:24:50.653614 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:51.153594181 +0000 UTC m=+147.982909395 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.657625 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-5snnf"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.752064 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9jnxh"]
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.753107 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9jnxh"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.758000 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.758673 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7"
Oct 06 08:24:50 crc kubenswrapper[4755]: E1006 08:24:50.759087 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:51.259074004 +0000 UTC m=+148.088389218 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.772442 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9jnxh"]
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.793854 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.860553 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.860757 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79294028-a667-4a44-bf46-a7597f221243-utilities\") pod \"certified-operators-9jnxh\" (UID: \"79294028-a667-4a44-bf46-a7597f221243\") " pod="openshift-marketplace/certified-operators-9jnxh"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.860850 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r4sb\" (UniqueName: \"kubernetes.io/projected/79294028-a667-4a44-bf46-a7597f221243-kube-api-access-8r4sb\") pod \"certified-operators-9jnxh\" (UID: 
\"79294028-a667-4a44-bf46-a7597f221243\") " pod="openshift-marketplace/certified-operators-9jnxh"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.860893 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79294028-a667-4a44-bf46-a7597f221243-catalog-content\") pod \"certified-operators-9jnxh\" (UID: \"79294028-a667-4a44-bf46-a7597f221243\") " pod="openshift-marketplace/certified-operators-9jnxh"
Oct 06 08:24:50 crc kubenswrapper[4755]: E1006 08:24:50.861010 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:51.360992979 +0000 UTC m=+148.190308193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.920747 4755 patch_prober.go:28] interesting pod/router-default-5444994796-zbxjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 06 08:24:50 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld
Oct 06 08:24:50 crc kubenswrapper[4755]: [+]process-running ok
Oct 06 08:24:50 crc kubenswrapper[4755]: healthz check failed
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.920806 4755 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-zbxjs" podUID="e14368cf-4d62-407f-b4b4-2318df6a6382" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.946683 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dzsrh"]
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.948314 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dzsrh"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.956541 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.959687 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dzsrh"]
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.970480 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.970539 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79294028-a667-4a44-bf46-a7597f221243-catalog-content\") pod \"certified-operators-9jnxh\" (UID: \"79294028-a667-4a44-bf46-a7597f221243\") " pod="openshift-marketplace/certified-operators-9jnxh"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.970603 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/79294028-a667-4a44-bf46-a7597f221243-utilities\") pod \"certified-operators-9jnxh\" (UID: \"79294028-a667-4a44-bf46-a7597f221243\") " pod="openshift-marketplace/certified-operators-9jnxh"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.970623 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.970657 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.970674 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.970696 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 
08:24:50.970721 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r4sb\" (UniqueName: \"kubernetes.io/projected/79294028-a667-4a44-bf46-a7597f221243-kube-api-access-8r4sb\") pod \"certified-operators-9jnxh\" (UID: \"79294028-a667-4a44-bf46-a7597f221243\") " pod="openshift-marketplace/certified-operators-9jnxh"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.976499 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79294028-a667-4a44-bf46-a7597f221243-utilities\") pod \"certified-operators-9jnxh\" (UID: \"79294028-a667-4a44-bf46-a7597f221243\") " pod="openshift-marketplace/certified-operators-9jnxh"
Oct 06 08:24:50 crc kubenswrapper[4755]: E1006 08:24:50.976903 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:51.47688408 +0000 UTC m=+148.306199294 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.977215 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.977649 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79294028-a667-4a44-bf46-a7597f221243-catalog-content\") pod \"certified-operators-9jnxh\" (UID: \"79294028-a667-4a44-bf46-a7597f221243\") " pod="openshift-marketplace/certified-operators-9jnxh"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.982234 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.982892 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:24:50 crc kubenswrapper[4755]: I1006 08:24:50.987893 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.000460 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r4sb\" (UniqueName: \"kubernetes.io/projected/79294028-a667-4a44-bf46-a7597f221243-kube-api-access-8r4sb\") pod \"certified-operators-9jnxh\" (UID: \"79294028-a667-4a44-bf46-a7597f221243\") " pod="openshift-marketplace/certified-operators-9jnxh"
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.074061 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.074266 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6nm\" (UniqueName: \"kubernetes.io/projected/b94e8d7e-d807-4809-ac0e-a219363e15d0-kube-api-access-2k6nm\") pod \"community-operators-dzsrh\" (UID: \"b94e8d7e-d807-4809-ac0e-a219363e15d0\") " pod="openshift-marketplace/community-operators-dzsrh"
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.074318 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b94e8d7e-d807-4809-ac0e-a219363e15d0-catalog-content\") pod \"community-operators-dzsrh\" (UID: \"b94e8d7e-d807-4809-ac0e-a219363e15d0\") " pod="openshift-marketplace/community-operators-dzsrh"
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.074366 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94e8d7e-d807-4809-ac0e-a219363e15d0-utilities\") pod \"community-operators-dzsrh\" (UID: \"b94e8d7e-d807-4809-ac0e-a219363e15d0\") " pod="openshift-marketplace/community-operators-dzsrh"
Oct 06 08:24:51 crc kubenswrapper[4755]: E1006 08:24:51.074471 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:51.574456408 +0000 UTC m=+148.403771622 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.099472 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9jnxh"
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.113134 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.124274 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.136372 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m2hkf"]
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.137519 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m2hkf"
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.149669 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.173290 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m2hkf"]
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.176150 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6nm\" (UniqueName: \"kubernetes.io/projected/b94e8d7e-d807-4809-ac0e-a219363e15d0-kube-api-access-2k6nm\") pod \"community-operators-dzsrh\" (UID: \"b94e8d7e-d807-4809-ac0e-a219363e15d0\") " pod="openshift-marketplace/community-operators-dzsrh"
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.176193 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7"
Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.176223 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94e8d7e-d807-4809-ac0e-a219363e15d0-catalog-content\") pod \"community-operators-dzsrh\" (UID: 
\"b94e8d7e-d807-4809-ac0e-a219363e15d0\") " pod="openshift-marketplace/community-operators-dzsrh" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.176273 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560eb86d-1a29-4eaf-b992-8fa7df3d492c-utilities\") pod \"certified-operators-m2hkf\" (UID: \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\") " pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.176299 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94e8d7e-d807-4809-ac0e-a219363e15d0-utilities\") pod \"community-operators-dzsrh\" (UID: \"b94e8d7e-d807-4809-ac0e-a219363e15d0\") " pod="openshift-marketplace/community-operators-dzsrh" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.176317 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lvlw\" (UniqueName: \"kubernetes.io/projected/560eb86d-1a29-4eaf-b992-8fa7df3d492c-kube-api-access-5lvlw\") pod \"certified-operators-m2hkf\" (UID: \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\") " pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.176340 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560eb86d-1a29-4eaf-b992-8fa7df3d492c-catalog-content\") pod \"certified-operators-m2hkf\" (UID: \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\") " pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:24:51 crc kubenswrapper[4755]: E1006 08:24:51.176991 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 08:24:51.676967607 +0000 UTC m=+148.506282821 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.177109 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94e8d7e-d807-4809-ac0e-a219363e15d0-utilities\") pod \"community-operators-dzsrh\" (UID: \"b94e8d7e-d807-4809-ac0e-a219363e15d0\") " pod="openshift-marketplace/community-operators-dzsrh" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.177388 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94e8d7e-d807-4809-ac0e-a219363e15d0-catalog-content\") pod \"community-operators-dzsrh\" (UID: \"b94e8d7e-d807-4809-ac0e-a219363e15d0\") " pod="openshift-marketplace/community-operators-dzsrh" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.244596 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6nm\" (UniqueName: \"kubernetes.io/projected/b94e8d7e-d807-4809-ac0e-a219363e15d0-kube-api-access-2k6nm\") pod \"community-operators-dzsrh\" (UID: \"b94e8d7e-d807-4809-ac0e-a219363e15d0\") " pod="openshift-marketplace/community-operators-dzsrh" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.281435 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.281754 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560eb86d-1a29-4eaf-b992-8fa7df3d492c-utilities\") pod \"certified-operators-m2hkf\" (UID: \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\") " pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.281791 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lvlw\" (UniqueName: \"kubernetes.io/projected/560eb86d-1a29-4eaf-b992-8fa7df3d492c-kube-api-access-5lvlw\") pod \"certified-operators-m2hkf\" (UID: \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\") " pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.281817 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560eb86d-1a29-4eaf-b992-8fa7df3d492c-catalog-content\") pod \"certified-operators-m2hkf\" (UID: \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\") " pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.282352 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560eb86d-1a29-4eaf-b992-8fa7df3d492c-catalog-content\") pod \"certified-operators-m2hkf\" (UID: \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\") " pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:24:51 crc kubenswrapper[4755]: E1006 08:24:51.282460 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:51.78243535 +0000 UTC m=+148.611750564 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.282725 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560eb86d-1a29-4eaf-b992-8fa7df3d492c-utilities\") pod \"certified-operators-m2hkf\" (UID: \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\") " pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.310423 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lvlw\" (UniqueName: \"kubernetes.io/projected/560eb86d-1a29-4eaf-b992-8fa7df3d492c-kube-api-access-5lvlw\") pod \"certified-operators-m2hkf\" (UID: \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\") " pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.326329 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wnmvf"] Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.327341 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.342019 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wnmvf"] Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.364941 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dzsrh" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.385423 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.385834 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccfb4e16-5c5f-4724-b694-02443086a6a1-catalog-content\") pod \"community-operators-wnmvf\" (UID: \"ccfb4e16-5c5f-4724-b694-02443086a6a1\") " pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.385855 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpvng\" (UniqueName: \"kubernetes.io/projected/ccfb4e16-5c5f-4724-b694-02443086a6a1-kube-api-access-lpvng\") pod \"community-operators-wnmvf\" (UID: \"ccfb4e16-5c5f-4724-b694-02443086a6a1\") " pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.385888 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ccfb4e16-5c5f-4724-b694-02443086a6a1-utilities\") pod \"community-operators-wnmvf\" (UID: \"ccfb4e16-5c5f-4724-b694-02443086a6a1\") " pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:24:51 crc kubenswrapper[4755]: E1006 08:24:51.386184 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:51.88617141 +0000 UTC m=+148.715486624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.492490 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.492863 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccfb4e16-5c5f-4724-b694-02443086a6a1-catalog-content\") pod \"community-operators-wnmvf\" (UID: \"ccfb4e16-5c5f-4724-b694-02443086a6a1\") " pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.492883 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpvng\" 
(UniqueName: \"kubernetes.io/projected/ccfb4e16-5c5f-4724-b694-02443086a6a1-kube-api-access-lpvng\") pod \"community-operators-wnmvf\" (UID: \"ccfb4e16-5c5f-4724-b694-02443086a6a1\") " pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.492917 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccfb4e16-5c5f-4724-b694-02443086a6a1-utilities\") pod \"community-operators-wnmvf\" (UID: \"ccfb4e16-5c5f-4724-b694-02443086a6a1\") " pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.493366 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccfb4e16-5c5f-4724-b694-02443086a6a1-utilities\") pod \"community-operators-wnmvf\" (UID: \"ccfb4e16-5c5f-4724-b694-02443086a6a1\") " pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:24:51 crc kubenswrapper[4755]: E1006 08:24:51.493432 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:51.993416347 +0000 UTC m=+148.822731561 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.493668 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccfb4e16-5c5f-4724-b694-02443086a6a1-catalog-content\") pod \"community-operators-wnmvf\" (UID: \"ccfb4e16-5c5f-4724-b694-02443086a6a1\") " pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.545469 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.572144 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" event={"ID":"62fd34fc-beed-483e-bd3c-3eeed8239d05","Type":"ContainerStarted","Data":"3c688597fc3932bc6e0e5983869c09f86a80ba29f09ddf1b1d5f32592c4f1280"} Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.572237 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.597933 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:51 crc kubenswrapper[4755]: E1006 08:24:51.601387 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:52.101362261 +0000 UTC m=+148.930677475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.612343 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpvng\" (UniqueName: \"kubernetes.io/projected/ccfb4e16-5c5f-4724-b694-02443086a6a1-kube-api-access-lpvng\") pod \"community-operators-wnmvf\" (UID: \"ccfb4e16-5c5f-4724-b694-02443086a6a1\") " pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.642785 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" podStartSLOduration=128.642754596 podStartE2EDuration="2m8.642754596s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:51.62231546 +0000 UTC m=+148.451630674" watchObservedRunningTime="2025-10-06 08:24:51.642754596 +0000 UTC m=+148.472069810" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.647157 4755 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.657257 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" event={"ID":"38c940d4-7aae-4661-9c98-aaab303881e5","Type":"ContainerStarted","Data":"9386d3fa6eaddaa51e39d1a4352b68d2972bee302a4fb87856427d7313c81da7"} Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.708518 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:51 crc kubenswrapper[4755]: E1006 08:24:51.709833 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:52.209812908 +0000 UTC m=+149.039128112 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.717779 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" event={"ID":"29a62c4d-4d44-4b7f-b115-15469a82976e","Type":"ContainerStarted","Data":"f76dbe3457a55b69db5e9b9e4c0db3502ba4e0a1316e0ab82a9adc654f3ea5c0"} Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.720068 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc" event={"ID":"79747f17-84d8-434a-afdb-c737c276ae90","Type":"ContainerStarted","Data":"ed07b8f1bb278a9dd8eec8b1e9144fbbb07a8af72b6b15c625649381bde42362"} Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.720107 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc" event={"ID":"79747f17-84d8-434a-afdb-c737c276ae90","Type":"ContainerStarted","Data":"91eb40f4e5ee9271d8d75ec995874a81a4ee414df6371d60d0eed0857c9d9a89"} Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.743942 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" event={"ID":"960d9d23-73b6-49b2-8772-eca49d507f2f","Type":"ContainerStarted","Data":"c5951e7cc98206292843533e28d0abb9e3c14e38fa028c011039d7dbef293a29"} Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.745327 4755 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zqsmk 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.745370 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" podUID="960d9d23-73b6-49b2-8772-eca49d507f2f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.774051 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" event={"ID":"7c08ae61-3d4a-4905-94aa-88e03148b073","Type":"ContainerStarted","Data":"eb4d44aa5b02219b582b6483da31a38ad6a07e46c5260c72d737f42ced9efd52"} Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.774117 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" event={"ID":"7c08ae61-3d4a-4905-94aa-88e03148b073","Type":"ContainerStarted","Data":"0095fa9f9af63ae5d29ee4c77ef734b9fc079ef78d09e8be7e9fd14cd7853064"} Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.809859 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.811712 4755 patch_prober.go:28] interesting pod/apiserver-76f77b778f-hztlt container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 
500" start-of-body=[+]ping ok Oct 06 08:24:51 crc kubenswrapper[4755]: [+]log ok Oct 06 08:24:51 crc kubenswrapper[4755]: [+]etcd ok Oct 06 08:24:51 crc kubenswrapper[4755]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 06 08:24:51 crc kubenswrapper[4755]: [+]poststarthook/generic-apiserver-start-informers ok Oct 06 08:24:51 crc kubenswrapper[4755]: [+]poststarthook/max-in-flight-filter ok Oct 06 08:24:51 crc kubenswrapper[4755]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 06 08:24:51 crc kubenswrapper[4755]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 06 08:24:51 crc kubenswrapper[4755]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 06 08:24:51 crc kubenswrapper[4755]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 06 08:24:51 crc kubenswrapper[4755]: [+]poststarthook/project.openshift.io-projectcache ok Oct 06 08:24:51 crc kubenswrapper[4755]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 06 08:24:51 crc kubenswrapper[4755]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Oct 06 08:24:51 crc kubenswrapper[4755]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 06 08:24:51 crc kubenswrapper[4755]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 06 08:24:51 crc kubenswrapper[4755]: livez check failed Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.811806 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-hztlt" podUID="f262d5f0-ec94-4668-9a49-47616dd4625f" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.812790 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" 
event={"ID":"c15c418e-734c-43df-b3e2-20619f626df3","Type":"ContainerStarted","Data":"0c87ae79e9b83043a5b0307bf5688d48bba48ad335a04dbf92126cf54b4deb80"} Oct 06 08:24:51 crc kubenswrapper[4755]: E1006 08:24:51.813137 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:52.313122077 +0000 UTC m=+149.142437291 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.837610 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6fx4d" podStartSLOduration=128.837583644 podStartE2EDuration="2m8.837583644s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:51.716899523 +0000 UTC m=+148.546214737" watchObservedRunningTime="2025-10-06 08:24:51.837583644 +0000 UTC m=+148.666898858" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.838409 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2c9z7" podStartSLOduration=128.838403323 podStartE2EDuration="2m8.838403323s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:51.837197483 +0000 UTC m=+148.666512697" watchObservedRunningTime="2025-10-06 08:24:51.838403323 +0000 UTC m=+148.667718537" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.850175 4755 generic.go:334] "Generic (PLEG): container finished" podID="8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a" containerID="0bc94119430468fc59e74b1502e87dcd974c25a5e080780367b1936b181f370c" exitCode=0 Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.850312 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" event={"ID":"8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a","Type":"ContainerDied","Data":"0bc94119430468fc59e74b1502e87dcd974c25a5e080780367b1936b181f370c"} Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.850357 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" event={"ID":"8d11b5e1-1a83-4ee3-a8b2-d191c97ddb6a","Type":"ContainerStarted","Data":"e90a63f73085e8f4d96fa6559d7d585e9fc7ead00866a5ac0f04d9e19febbc1d"} Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.851322 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.893825 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dzfqk" podStartSLOduration=128.893801506 podStartE2EDuration="2m8.893801506s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:51.891804266 +0000 UTC m=+148.721119480" watchObservedRunningTime="2025-10-06 08:24:51.893801506 +0000 UTC m=+148.723116720" Oct 06 08:24:51 crc kubenswrapper[4755]: 
I1006 08:24:51.911550 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:51 crc kubenswrapper[4755]: E1006 08:24:51.912067 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:52.412023498 +0000 UTC m=+149.241338712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.912158 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:51 crc kubenswrapper[4755]: E1006 08:24:51.914118 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 08:24:52.414109039 +0000 UTC m=+149.243424253 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.929226 4755 patch_prober.go:28] interesting pod/router-default-5444994796-zbxjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:24:51 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Oct 06 08:24:51 crc kubenswrapper[4755]: [+]process-running ok Oct 06 08:24:51 crc kubenswrapper[4755]: healthz check failed Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.929277 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbxjs" podUID="e14368cf-4d62-407f-b4b4-2318df6a6382" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.932110 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dnfbc" podStartSLOduration=128.932097735 podStartE2EDuration="2m8.932097735s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:51.929102911 +0000 UTC m=+148.758418125" watchObservedRunningTime="2025-10-06 08:24:51.932097735 +0000 UTC 
m=+148.761412949" Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.972207 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" event={"ID":"833f574b-27b1-4b3c-a2d9-2e22d1434926","Type":"ContainerStarted","Data":"90bd05866eeae25e71fbb25f336d3f644914a11e3944a08154bf8388ea48b120"} Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.972704 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fdr74" event={"ID":"1e65680a-1fd6-41e3-a51a-c5bc7654216f","Type":"ContainerStarted","Data":"c16aec59e33048f89be256a76b6729184cd212f978cb423dc651bafb10dcb351"} Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.972717 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fdr74" event={"ID":"1e65680a-1fd6-41e3-a51a-c5bc7654216f","Type":"ContainerStarted","Data":"c5eabc3de13d85f1aad609c555b78d377dc756bff0cfb31229525dc5e1b8b912"} Oct 06 08:24:51 crc kubenswrapper[4755]: I1006 08:24:51.983601 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jsvj5" event={"ID":"d67128b0-39d6-49ca-a8f0-337fdb64ef39","Type":"ContainerStarted","Data":"86226af53d6aea68c9ca499757aeaccd1c9af3e2bfbc85681c0642dced0fbfd3"} Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.013582 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hj99z" event={"ID":"e47b738a-2656-4f75-8ce7-da45f4e17424","Type":"ContainerStarted","Data":"e0bf85e1e3778035b14cb300037372010fb2652b4e38d3b08869b00f8173d3c3"} Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.016346 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-klxzw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": 
dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.016414 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klxzw" podUID="264bea46-510c-4a6c-ba59-91b0388882de" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.022044 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:52 crc kubenswrapper[4755]: E1006 08:24:52.022118 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:52.522089654 +0000 UTC m=+149.351404868 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.022658 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:52 crc kubenswrapper[4755]: E1006 08:24:52.024268 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:52.524251478 +0000 UTC m=+149.353566692 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.046187 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" podStartSLOduration=129.046161131 podStartE2EDuration="2m9.046161131s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:51.998815718 +0000 UTC m=+148.828130952" watchObservedRunningTime="2025-10-06 08:24:52.046161131 +0000 UTC m=+148.875476345" Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.046319 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r52j4" podStartSLOduration=129.046310285 podStartE2EDuration="2m9.046310285s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:52.043841903 +0000 UTC m=+148.873157117" watchObservedRunningTime="2025-10-06 08:24:52.046310285 +0000 UTC m=+148.875625499" Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.131876 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-fdr74" podStartSLOduration=129.131845304 podStartE2EDuration="2m9.131845304s" 
podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:52.091231707 +0000 UTC m=+148.920546921" watchObservedRunningTime="2025-10-06 08:24:52.131845304 +0000 UTC m=+148.961160518" Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.141005 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:52 crc kubenswrapper[4755]: E1006 08:24:52.143055 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:52.643037791 +0000 UTC m=+149.472352995 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.189729 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jsvj5" podStartSLOduration=129.189706347 podStartE2EDuration="2m9.189706347s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:52.140342394 +0000 UTC m=+148.969657608" watchObservedRunningTime="2025-10-06 08:24:52.189706347 +0000 UTC m=+149.019021551" Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.214968 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4fl9" Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.243045 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:52 crc kubenswrapper[4755]: E1006 08:24:52.243628 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 08:24:52.743606773 +0000 UTC m=+149.572921987 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.266972 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" podStartSLOduration=129.266923659 podStartE2EDuration="2m9.266923659s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:52.191688685 +0000 UTC m=+149.021003899" watchObservedRunningTime="2025-10-06 08:24:52.266923659 +0000 UTC m=+149.096238873" Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.344541 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:52 crc kubenswrapper[4755]: E1006 08:24:52.345002 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:52.844952243 +0000 UTC m=+149.674267457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.345869 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:52 crc kubenswrapper[4755]: E1006 08:24:52.346320 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:52.846305556 +0000 UTC m=+149.675620770 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.447639 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:52 crc kubenswrapper[4755]: E1006 08:24:52.448103 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:52.948078118 +0000 UTC m=+149.777393342 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.550031 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:52 crc kubenswrapper[4755]: E1006 08:24:52.550500 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:53.050480795 +0000 UTC m=+149.879796019 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.562362 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-njfmq" Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.652078 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:52 crc kubenswrapper[4755]: E1006 08:24:52.652480 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:53.152464652 +0000 UTC m=+149.981779866 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.666420 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dzsrh"] Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.709142 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9jnxh"] Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.754721 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:52 crc kubenswrapper[4755]: E1006 08:24:52.755148 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:53.255129975 +0000 UTC m=+150.084445189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.856854 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:52 crc kubenswrapper[4755]: E1006 08:24:52.857328 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:53.357288806 +0000 UTC m=+150.186604020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.915189 4755 patch_prober.go:28] interesting pod/router-default-5444994796-zbxjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:24:52 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Oct 06 08:24:52 crc kubenswrapper[4755]: [+]process-running ok Oct 06 08:24:52 crc kubenswrapper[4755]: healthz check failed Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.915267 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbxjs" podUID="e14368cf-4d62-407f-b4b4-2318df6a6382" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.936799 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w7dnk"] Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.937951 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.948474 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.952169 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wnmvf"] Oct 06 08:24:52 crc kubenswrapper[4755]: I1006 08:24:52.959443 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:52 crc kubenswrapper[4755]: E1006 08:24:52.959809 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:53.459794455 +0000 UTC m=+150.289109669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.012097 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m2hkf"] Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.034235 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7dnk"] Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.060147 4755 generic.go:334] "Generic (PLEG): container finished" podID="c15c418e-734c-43df-b3e2-20619f626df3" containerID="0c87ae79e9b83043a5b0307bf5688d48bba48ad335a04dbf92126cf54b4deb80" exitCode=0 Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.060506 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.060636 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" event={"ID":"c15c418e-734c-43df-b3e2-20619f626df3","Type":"ContainerDied","Data":"0c87ae79e9b83043a5b0307bf5688d48bba48ad335a04dbf92126cf54b4deb80"} Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.060817 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/62ecb1af-d269-4f5a-84ec-026b74882414-utilities\") pod \"redhat-marketplace-w7dnk\" (UID: \"62ecb1af-d269-4f5a-84ec-026b74882414\") " pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.060869 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlrhm\" (UniqueName: \"kubernetes.io/projected/62ecb1af-d269-4f5a-84ec-026b74882414-kube-api-access-jlrhm\") pod \"redhat-marketplace-w7dnk\" (UID: \"62ecb1af-d269-4f5a-84ec-026b74882414\") " pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.060916 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62ecb1af-d269-4f5a-84ec-026b74882414-catalog-content\") pod \"redhat-marketplace-w7dnk\" (UID: \"62ecb1af-d269-4f5a-84ec-026b74882414\") " pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:24:53 crc kubenswrapper[4755]: E1006 08:24:53.061103 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:53.561082754 +0000 UTC m=+150.390397958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:53 crc kubenswrapper[4755]: W1006 08:24:53.074708 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod560eb86d_1a29_4eaf_b992_8fa7df3d492c.slice/crio-25225c801858c982c86ed7b342bfd23f081345d3966b58da32382fc703e0f581 WatchSource:0}: Error finding container 25225c801858c982c86ed7b342bfd23f081345d3966b58da32382fc703e0f581: Status 404 returned error can't find the container with id 25225c801858c982c86ed7b342bfd23f081345d3966b58da32382fc703e0f581 Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.078098 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnmvf" event={"ID":"ccfb4e16-5c5f-4724-b694-02443086a6a1","Type":"ContainerStarted","Data":"5926c3cc4191dc47066b0e4f528cfe7a6c04982a628aed3f3cd79e0541faefb7"} Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.084308 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a0aa9a33d5a45aab5c849af231a72ff9294d6644bc01a8d3701b4aebd701a7c5"} Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.084361 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3982ae624ff66c36df6a9f76f573f8fd3229e636deef5ac34f550fbfe49a7131"} 
Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.099445 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jnxh" event={"ID":"79294028-a667-4a44-bf46-a7597f221243","Type":"ContainerStarted","Data":"efc76c84d807f7011156d15453669b8ca550b4efe83fe146720e0c5db4f479b0"} Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.101272 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5c496ecb4c3e700a34d6ecfddeec2c72e711cbd4b0ecbb327c0281e016c5a9f3"} Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.101309 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b8c4a2b94de6ad48b8cd87d9228da9e0f8883b7c17fde77026e9a2f65d4bcdc5"} Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.102073 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.113972 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dzsrh" event={"ID":"b94e8d7e-d807-4809-ac0e-a219363e15d0","Type":"ContainerStarted","Data":"87360e739176757f1d10b66627cbdfb11031ea0727f89dcc2a9f65593eb1daf7"} Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.137256 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a950c08547c1610cbb1fe972437867894ce1e06b495357063db81346a6f031a1"} Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.168673 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62ecb1af-d269-4f5a-84ec-026b74882414-utilities\") pod \"redhat-marketplace-w7dnk\" (UID: \"62ecb1af-d269-4f5a-84ec-026b74882414\") " pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.168787 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlrhm\" (UniqueName: \"kubernetes.io/projected/62ecb1af-d269-4f5a-84ec-026b74882414-kube-api-access-jlrhm\") pod \"redhat-marketplace-w7dnk\" (UID: \"62ecb1af-d269-4f5a-84ec-026b74882414\") " pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.168957 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62ecb1af-d269-4f5a-84ec-026b74882414-catalog-content\") pod \"redhat-marketplace-w7dnk\" (UID: \"62ecb1af-d269-4f5a-84ec-026b74882414\") " pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.169058 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.174002 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62ecb1af-d269-4f5a-84ec-026b74882414-utilities\") pod \"redhat-marketplace-w7dnk\" (UID: \"62ecb1af-d269-4f5a-84ec-026b74882414\") " pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.178057 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62ecb1af-d269-4f5a-84ec-026b74882414-catalog-content\") pod \"redhat-marketplace-w7dnk\" (UID: \"62ecb1af-d269-4f5a-84ec-026b74882414\") " pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:24:53 crc kubenswrapper[4755]: E1006 08:24:53.178846 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:53.678822242 +0000 UTC m=+150.508137456 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.187255 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.233629 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlrhm\" (UniqueName: \"kubernetes.io/projected/62ecb1af-d269-4f5a-84ec-026b74882414-kube-api-access-jlrhm\") pod \"redhat-marketplace-w7dnk\" (UID: \"62ecb1af-d269-4f5a-84ec-026b74882414\") " pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.286937 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:53 crc kubenswrapper[4755]: E1006 08:24:53.290414 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:53.790396495 +0000 UTC m=+150.619711699 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.290810 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.325116 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v6ww9"] Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.336787 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.357627 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6ww9"] Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.390431 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.391048 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tffb\" (UniqueName: \"kubernetes.io/projected/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-kube-api-access-5tffb\") pod \"redhat-marketplace-v6ww9\" (UID: \"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\") " pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.391097 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-utilities\") pod \"redhat-marketplace-v6ww9\" (UID: \"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\") " pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.391140 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-catalog-content\") pod \"redhat-marketplace-v6ww9\" (UID: \"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\") " pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:24:53 crc kubenswrapper[4755]: E1006 
08:24:53.391514 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:53.891496399 +0000 UTC m=+150.720811613 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.502630 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:53 crc kubenswrapper[4755]: E1006 08:24:53.502856 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:54.002817818 +0000 UTC m=+150.832133032 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.502963 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-catalog-content\") pod \"redhat-marketplace-v6ww9\" (UID: \"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\") " pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.503050 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.503092 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tffb\" (UniqueName: \"kubernetes.io/projected/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-kube-api-access-5tffb\") pod \"redhat-marketplace-v6ww9\" (UID: \"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\") " pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.503145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-utilities\") pod \"redhat-marketplace-v6ww9\" (UID: 
\"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\") " pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:24:53 crc kubenswrapper[4755]: E1006 08:24:53.504059 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:54.004039048 +0000 UTC m=+150.833354262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.509259 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-utilities\") pod \"redhat-marketplace-v6ww9\" (UID: \"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\") " pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.509297 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-catalog-content\") pod \"redhat-marketplace-v6ww9\" (UID: \"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\") " pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.535710 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tffb\" (UniqueName: \"kubernetes.io/projected/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-kube-api-access-5tffb\") pod \"redhat-marketplace-v6ww9\" (UID: 
\"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\") " pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.580550 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.604975 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:53 crc kubenswrapper[4755]: E1006 08:24:53.605232 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:54.105186214 +0000 UTC m=+150.934501428 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.605263 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:53 crc kubenswrapper[4755]: E1006 08:24:53.605820 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:54.105776698 +0000 UTC m=+150.935091912 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.648029 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7dnk"] Oct 06 08:24:53 crc kubenswrapper[4755]: W1006 08:24:53.674228 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62ecb1af_d269_4f5a_84ec_026b74882414.slice/crio-18746b12072333b6fd8444dc46005e4f41635744aca81c2850368408fb5d1ae4 WatchSource:0}: Error finding container 18746b12072333b6fd8444dc46005e4f41635744aca81c2850368408fb5d1ae4: Status 404 returned error can't find the container with id 18746b12072333b6fd8444dc46005e4f41635744aca81c2850368408fb5d1ae4 Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.706742 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:53 crc kubenswrapper[4755]: E1006 08:24:53.708105 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:54.208080663 +0000 UTC m=+151.037395877 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.808805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:53 crc kubenswrapper[4755]: E1006 08:24:53.809347 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:54.309321641 +0000 UTC m=+151.138636865 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.905797 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6ww9"] Oct 06 08:24:53 crc kubenswrapper[4755]: W1006 08:24:53.909344 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a3d30f0_54da_4d3e_add8_faa0c3eeea1a.slice/crio-cf8689c0c1107da37dbae0864642533ec5ae030aacaabb2c949576829cb4670e WatchSource:0}: Error finding container cf8689c0c1107da37dbae0864642533ec5ae030aacaabb2c949576829cb4670e: Status 404 returned error can't find the container with id cf8689c0c1107da37dbae0864642533ec5ae030aacaabb2c949576829cb4670e Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.909788 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:53 crc kubenswrapper[4755]: E1006 08:24:53.910154 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:54.410125579 +0000 UTC m=+151.239440793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.913427 4755 patch_prober.go:28] interesting pod/router-default-5444994796-zbxjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:24:53 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Oct 06 08:24:53 crc kubenswrapper[4755]: [+]process-running ok Oct 06 08:24:53 crc kubenswrapper[4755]: healthz check failed Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.913462 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbxjs" podUID="e14368cf-4d62-407f-b4b4-2318df6a6382" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.919034 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4zm7s"] Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.920181 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.925691 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.929516 4755 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 06 08:24:53 crc kubenswrapper[4755]: I1006 08:24:53.934857 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4zm7s"] Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.012702 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-catalog-content\") pod \"redhat-operators-4zm7s\" (UID: \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\") " pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.012761 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mktmr\" (UniqueName: \"kubernetes.io/projected/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-kube-api-access-mktmr\") pod \"redhat-operators-4zm7s\" (UID: \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\") " pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.012856 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:54 crc kubenswrapper[4755]: 
I1006 08:24:54.012883 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-utilities\") pod \"redhat-operators-4zm7s\" (UID: \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\") " pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:24:54 crc kubenswrapper[4755]: E1006 08:24:54.013313 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:24:54.513296985 +0000 UTC m=+151.342612199 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.091643 4755 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-06T08:24:53.9295466Z","Handler":null,"Name":""} Oct 06 08:24:54 crc kubenswrapper[4755]: E1006 08:24:54.115326 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:24:54.61526278 +0000 UTC m=+151.444578114 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.115094 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.115986 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.116045 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-utilities\") pod \"redhat-operators-4zm7s\" (UID: \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\") " pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:24:54 crc kubenswrapper[4755]: E1006 08:24:54.116653 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 08:24:54.616640734 +0000 UTC m=+151.445955948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6zp7" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.117360 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-utilities\") pod \"redhat-operators-4zm7s\" (UID: \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\") " pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.118526 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-catalog-content\") pod \"redhat-operators-4zm7s\" (UID: \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\") " pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.119288 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-catalog-content\") pod \"redhat-operators-4zm7s\" (UID: \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\") " pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.120097 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mktmr\" (UniqueName: \"kubernetes.io/projected/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-kube-api-access-mktmr\") pod 
\"redhat-operators-4zm7s\" (UID: \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\") " pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.127901 4755 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.127970 4755 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.146928 4755 generic.go:334] "Generic (PLEG): container finished" podID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" containerID="37a00d6dff43519e39116638fed3ea8dd50d86dd873ab314b354a54b666cfcb7" exitCode=0 Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.147014 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2hkf" event={"ID":"560eb86d-1a29-4eaf-b992-8fa7df3d492c","Type":"ContainerDied","Data":"37a00d6dff43519e39116638fed3ea8dd50d86dd873ab314b354a54b666cfcb7"} Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.147057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2hkf" event={"ID":"560eb86d-1a29-4eaf-b992-8fa7df3d492c","Type":"ContainerStarted","Data":"25225c801858c982c86ed7b342bfd23f081345d3966b58da32382fc703e0f581"} Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.149672 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.159997 4755 generic.go:334] "Generic (PLEG): container finished" podID="79294028-a667-4a44-bf46-a7597f221243" containerID="0a7817888ad0412212d54727689dfc1afa937e432d47b3de3e629e564ebe8f6b" exitCode=0 Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 
08:24:54.160152 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jnxh" event={"ID":"79294028-a667-4a44-bf46-a7597f221243","Type":"ContainerDied","Data":"0a7817888ad0412212d54727689dfc1afa937e432d47b3de3e629e564ebe8f6b"} Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.169599 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mktmr\" (UniqueName: \"kubernetes.io/projected/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-kube-api-access-mktmr\") pod \"redhat-operators-4zm7s\" (UID: \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\") " pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.183758 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6ww9" event={"ID":"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a","Type":"ContainerStarted","Data":"cf8689c0c1107da37dbae0864642533ec5ae030aacaabb2c949576829cb4670e"} Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.208190 4755 generic.go:334] "Generic (PLEG): container finished" podID="b94e8d7e-d807-4809-ac0e-a219363e15d0" containerID="391852e6f7af816ca1cb8af2649ac473dbc97c165513226499c0f47b1e2d0714" exitCode=0 Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.208333 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dzsrh" event={"ID":"b94e8d7e-d807-4809-ac0e-a219363e15d0","Type":"ContainerDied","Data":"391852e6f7af816ca1cb8af2649ac473dbc97c165513226499c0f47b1e2d0714"} Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.224036 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hj99z" event={"ID":"e47b738a-2656-4f75-8ce7-da45f4e17424","Type":"ContainerStarted","Data":"91c7ad76975e228239a3883ded0ca1101231e3f93bc2393a68c354d422805ddf"} Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.224058 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.234239 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.235944 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0a080e19f2f47b2cc8a9ed360286aa72ec61509402eb2c7ea179fda765966608"} Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.248960 4755 generic.go:334] "Generic (PLEG): container finished" podID="62ecb1af-d269-4f5a-84ec-026b74882414" containerID="d749fabeb91ef93ef7058912fdf5a1109b0b58d6997e0071dc8ae97821a30b61" exitCode=0 Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.249321 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7dnk" event={"ID":"62ecb1af-d269-4f5a-84ec-026b74882414","Type":"ContainerDied","Data":"d749fabeb91ef93ef7058912fdf5a1109b0b58d6997e0071dc8ae97821a30b61"} Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.249419 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7dnk" 
event={"ID":"62ecb1af-d269-4f5a-84ec-026b74882414","Type":"ContainerStarted","Data":"18746b12072333b6fd8444dc46005e4f41635744aca81c2850368408fb5d1ae4"} Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.256240 4755 generic.go:334] "Generic (PLEG): container finished" podID="ccfb4e16-5c5f-4724-b694-02443086a6a1" containerID="c0ef87dfc5ec7eb4f21bf8007ee6dc5088351acbb8c45efd2f3e11c4e7dc45db" exitCode=0 Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.258329 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnmvf" event={"ID":"ccfb4e16-5c5f-4724-b694-02443086a6a1","Type":"ContainerDied","Data":"c0ef87dfc5ec7eb4f21bf8007ee6dc5088351acbb8c45efd2f3e11c4e7dc45db"} Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.268291 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gvsjx" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.321129 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6v7p8"] Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.322769 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.326849 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.350839 4755 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.350882 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.367250 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6v7p8"] Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.382971 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.431499 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-utilities\") pod \"redhat-operators-6v7p8\" (UID: \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\") " pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.431619 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xfqp\" (UniqueName: \"kubernetes.io/projected/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-kube-api-access-4xfqp\") pod \"redhat-operators-6v7p8\" (UID: \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\") " pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.431645 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-catalog-content\") pod \"redhat-operators-6v7p8\" (UID: \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\") " pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.449334 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6zp7\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.517782 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.532641 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xfqp\" (UniqueName: \"kubernetes.io/projected/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-kube-api-access-4xfqp\") pod \"redhat-operators-6v7p8\" (UID: \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\") " pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.532718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-catalog-content\") pod \"redhat-operators-6v7p8\" (UID: \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\") " pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.532762 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-utilities\") pod \"redhat-operators-6v7p8\" (UID: \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\") " 
pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.535421 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-catalog-content\") pod \"redhat-operators-6v7p8\" (UID: \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\") " pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.535421 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-utilities\") pod \"redhat-operators-6v7p8\" (UID: \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\") " pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.566183 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xfqp\" (UniqueName: \"kubernetes.io/projected/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-kube-api-access-4xfqp\") pod \"redhat-operators-6v7p8\" (UID: \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\") " pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.633654 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15c418e-734c-43df-b3e2-20619f626df3-secret-volume\") pod \"c15c418e-734c-43df-b3e2-20619f626df3\" (UID: \"c15c418e-734c-43df-b3e2-20619f626df3\") " Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.633718 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15c418e-734c-43df-b3e2-20619f626df3-config-volume\") pod \"c15c418e-734c-43df-b3e2-20619f626df3\" (UID: \"c15c418e-734c-43df-b3e2-20619f626df3\") " Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.633860 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4qbd5\" (UniqueName: \"kubernetes.io/projected/c15c418e-734c-43df-b3e2-20619f626df3-kube-api-access-4qbd5\") pod \"c15c418e-734c-43df-b3e2-20619f626df3\" (UID: \"c15c418e-734c-43df-b3e2-20619f626df3\") " Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.634992 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c15c418e-734c-43df-b3e2-20619f626df3-config-volume" (OuterVolumeSpecName: "config-volume") pod "c15c418e-734c-43df-b3e2-20619f626df3" (UID: "c15c418e-734c-43df-b3e2-20619f626df3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.638468 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c15c418e-734c-43df-b3e2-20619f626df3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c15c418e-734c-43df-b3e2-20619f626df3" (UID: "c15c418e-734c-43df-b3e2-20619f626df3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.638867 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c15c418e-734c-43df-b3e2-20619f626df3-kube-api-access-4qbd5" (OuterVolumeSpecName: "kube-api-access-4qbd5") pod "c15c418e-734c-43df-b3e2-20619f626df3" (UID: "c15c418e-734c-43df-b3e2-20619f626df3"). InnerVolumeSpecName "kube-api-access-4qbd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.644279 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.650392 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-hztlt" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.668411 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.702168 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.735130 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4zm7s"] Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.735884 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15c418e-734c-43df-b3e2-20619f626df3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.735930 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15c418e-734c-43df-b3e2-20619f626df3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.735950 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qbd5\" (UniqueName: \"kubernetes.io/projected/c15c418e-734c-43df-b3e2-20619f626df3-kube-api-access-4qbd5\") on node \"crc\" DevicePath \"\"" Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.911061 4755 patch_prober.go:28] interesting pod/router-default-5444994796-zbxjs container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:24:54 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Oct 06 08:24:54 crc kubenswrapper[4755]: [+]process-running ok Oct 06 08:24:54 crc kubenswrapper[4755]: healthz check failed Oct 06 08:24:54 crc kubenswrapper[4755]: I1006 08:24:54.911470 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbxjs" podUID="e14368cf-4d62-407f-b4b4-2318df6a6382" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.012462 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6v7p8"] Oct 06 08:24:55 crc kubenswrapper[4755]: W1006 08:24:55.060207 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b269dcd_1ae6_4d95_b56d_b72b7ad9eaa1.slice/crio-02cbb7db5386fc4c067277529f8328d532b096322e4630fd5e2885a0477fe143 WatchSource:0}: Error finding container 02cbb7db5386fc4c067277529f8328d532b096322e4630fd5e2885a0477fe143: Status 404 returned error can't find the container with id 02cbb7db5386fc4c067277529f8328d532b096322e4630fd5e2885a0477fe143 Oct 06 08:24:55 crc kubenswrapper[4755]: W1006 08:24:55.070711 4755 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fc654c8_49a2_4815_b0c6_edfc8ac3d836.slice/crio-ffa44547305ccdad2c7bca4e71c455bface3e8dd80791cb86dd358eb06ee6ef4/cpu.weight": read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fc654c8_49a2_4815_b0c6_edfc8ac3d836.slice/crio-ffa44547305ccdad2c7bca4e71c455bface3e8dd80791cb86dd358eb06ee6ef4/cpu.weight: no such device Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.106994 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.107076 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.108712 4755 patch_prober.go:28] interesting pod/console-f9d7485db-nrx4l container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.108808 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nrx4l" podUID="d5ef001b-4224-45ce-bdca-5865c9092f0e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 06 08:24:55 crc kubenswrapper[4755]: E1006 08:24:55.142481 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fc654c8_49a2_4815_b0c6_edfc8ac3d836.slice/crio-4e938cb5c3340f91f4a036acd9ad47d37f63c7cccbf0a2b0ddd4895efd7b0f13.scope\": RecentStats: unable to find data in memory cache]" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.277318 4755 generic.go:334] "Generic (PLEG): container finished" podID="6fc654c8-49a2-4815-b0c6-edfc8ac3d836" containerID="4e938cb5c3340f91f4a036acd9ad47d37f63c7cccbf0a2b0ddd4895efd7b0f13" exitCode=0 Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.277402 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zm7s" event={"ID":"6fc654c8-49a2-4815-b0c6-edfc8ac3d836","Type":"ContainerDied","Data":"4e938cb5c3340f91f4a036acd9ad47d37f63c7cccbf0a2b0ddd4895efd7b0f13"} Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.277436 
4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zm7s" event={"ID":"6fc654c8-49a2-4815-b0c6-edfc8ac3d836","Type":"ContainerStarted","Data":"ffa44547305ccdad2c7bca4e71c455bface3e8dd80791cb86dd358eb06ee6ef4"} Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.282918 4755 generic.go:334] "Generic (PLEG): container finished" podID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" containerID="1efd1d30f4af2618d1395fb1717c0547de3dfaa41cb1446756e69836c1ceba24" exitCode=0 Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.283017 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6ww9" event={"ID":"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a","Type":"ContainerDied","Data":"1efd1d30f4af2618d1395fb1717c0547de3dfaa41cb1446756e69836c1ceba24"} Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.306366 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hj99z" event={"ID":"e47b738a-2656-4f75-8ce7-da45f4e17424","Type":"ContainerStarted","Data":"cc2e9e469cc9a02209ba78250581b8faebd408fd05da5c60b2fbbaf8f6842524"} Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.306425 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hj99z" event={"ID":"e47b738a-2656-4f75-8ce7-da45f4e17424","Type":"ContainerStarted","Data":"f010c1bce4e92974dc1679d747a2da23af40cb08f37999a7ca857b4eede27185"} Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.312280 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" event={"ID":"c15c418e-734c-43df-b3e2-20619f626df3","Type":"ContainerDied","Data":"2cf4dbde3203b3f58f2c028fb4eb4fad84516b538c4525eecfe731f22e2955e3"} Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.312339 4755 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2cf4dbde3203b3f58f2c028fb4eb4fad84516b538c4525eecfe731f22e2955e3" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.312340 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.318588 4755 generic.go:334] "Generic (PLEG): container finished" podID="5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" containerID="160a2846e79ce0a1703b0040452e71b6addb020c42a1880d6c73c47a7f1b2d53" exitCode=0 Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.320315 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6v7p8" event={"ID":"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1","Type":"ContainerDied","Data":"160a2846e79ce0a1703b0040452e71b6addb020c42a1880d6c73c47a7f1b2d53"} Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.321136 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6v7p8" event={"ID":"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1","Type":"ContainerStarted","Data":"02cbb7db5386fc4c067277529f8328d532b096322e4630fd5e2885a0477fe143"} Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.334855 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g6zp7"] Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.387058 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-hj99z" podStartSLOduration=13.387030487 podStartE2EDuration="13.387030487s" podCreationTimestamp="2025-10-06 08:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:55.355918766 +0000 UTC m=+152.185233980" watchObservedRunningTime="2025-10-06 08:24:55.387030487 +0000 UTC m=+152.216345701" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 
08:24:55.747402 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-klxzw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.747974 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-klxzw" podUID="264bea46-510c-4a6c-ba59-91b0388882de" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.747446 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-klxzw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.748128 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klxzw" podUID="264bea46-510c-4a6c-ba59-91b0388882de" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.890969 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.904864 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.908598 4755 patch_prober.go:28] interesting pod/router-default-5444994796-zbxjs container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:24:55 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Oct 06 08:24:55 crc kubenswrapper[4755]: [+]process-running ok Oct 06 08:24:55 crc kubenswrapper[4755]: healthz check failed Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.908649 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbxjs" podUID="e14368cf-4d62-407f-b4b4-2318df6a6382" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.930283 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 08:24:55 crc kubenswrapper[4755]: E1006 08:24:55.930679 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c15c418e-734c-43df-b3e2-20619f626df3" containerName="collect-profiles" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.930703 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c15c418e-734c-43df-b3e2-20619f626df3" containerName="collect-profiles" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.930865 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c15c418e-734c-43df-b3e2-20619f626df3" containerName="collect-profiles" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.931408 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.935745 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.956022 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 06 08:24:55 crc kubenswrapper[4755]: I1006 08:24:55.956237 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.063813 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9400682a-9230-42e8-95eb-2651c4ebdb5d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9400682a-9230-42e8-95eb-2651c4ebdb5d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.063896 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9400682a-9230-42e8-95eb-2651c4ebdb5d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9400682a-9230-42e8-95eb-2651c4ebdb5d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.165661 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9400682a-9230-42e8-95eb-2651c4ebdb5d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9400682a-9230-42e8-95eb-2651c4ebdb5d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.165732 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9400682a-9230-42e8-95eb-2651c4ebdb5d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9400682a-9230-42e8-95eb-2651c4ebdb5d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.165868 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9400682a-9230-42e8-95eb-2651c4ebdb5d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9400682a-9230-42e8-95eb-2651c4ebdb5d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.199013 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9400682a-9230-42e8-95eb-2651c4ebdb5d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9400682a-9230-42e8-95eb-2651c4ebdb5d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.289725 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.351846 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" event={"ID":"bb3290ed-89c6-4367-a39c-0c8fc61a3f88","Type":"ContainerStarted","Data":"a5e89d04175521116eb204278e2afdc27da65997071619922d124f10a0bb5ed6"} Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.351981 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" event={"ID":"bb3290ed-89c6-4367-a39c-0c8fc61a3f88","Type":"ContainerStarted","Data":"687a6f37132ebf91fae0693c0995f57718e86344e643f5c3f64061382a74ad0e"} Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.352251 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.383490 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" podStartSLOduration=133.383459672 podStartE2EDuration="2m13.383459672s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:56.378336986 +0000 UTC m=+153.207652200" watchObservedRunningTime="2025-10-06 08:24:56.383459672 +0000 UTC m=+153.212774886" Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.836330 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.910198 4755 patch_prober.go:28] interesting pod/router-default-5444994796-zbxjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Oct 06 08:24:56 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Oct 06 08:24:56 crc kubenswrapper[4755]: [+]process-running ok Oct 06 08:24:56 crc kubenswrapper[4755]: healthz check failed Oct 06 08:24:56 crc kubenswrapper[4755]: I1006 08:24:56.910303 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbxjs" podUID="e14368cf-4d62-407f-b4b4-2318df6a6382" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.029907 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.032229 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.037275 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.037741 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.038966 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.084955 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5479300-81a9-4688-93a4-6a7498b6223f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a5479300-81a9-4688-93a4-6a7498b6223f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.085971 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5479300-81a9-4688-93a4-6a7498b6223f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a5479300-81a9-4688-93a4-6a7498b6223f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.187311 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5479300-81a9-4688-93a4-6a7498b6223f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a5479300-81a9-4688-93a4-6a7498b6223f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.187411 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5479300-81a9-4688-93a4-6a7498b6223f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a5479300-81a9-4688-93a4-6a7498b6223f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.187531 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5479300-81a9-4688-93a4-6a7498b6223f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a5479300-81a9-4688-93a4-6a7498b6223f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.216659 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5479300-81a9-4688-93a4-6a7498b6223f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a5479300-81a9-4688-93a4-6a7498b6223f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.382493 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9400682a-9230-42e8-95eb-2651c4ebdb5d","Type":"ContainerStarted","Data":"c3b679bbe10f113a6afddfbd3855f5c05b3cd137c4baaf9c9a6193ca9b350cc3"} Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.412044 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cnt4g" Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.440892 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.909978 4755 patch_prober.go:28] interesting pod/router-default-5444994796-zbxjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:24:57 crc kubenswrapper[4755]: [+]has-synced ok Oct 06 08:24:57 crc kubenswrapper[4755]: [+]process-running ok Oct 06 08:24:57 crc kubenswrapper[4755]: healthz check failed Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.910507 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbxjs" podUID="e14368cf-4d62-407f-b4b4-2318df6a6382" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:24:57 crc kubenswrapper[4755]: I1006 08:24:57.983331 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 08:24:58 crc kubenswrapper[4755]: I1006 08:24:58.394535 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a5479300-81a9-4688-93a4-6a7498b6223f","Type":"ContainerStarted","Data":"5d9427c54e2082e968f0c57109f196673dc41212d2c095759ff2d33cb8006401"} Oct 06 08:24:58 crc kubenswrapper[4755]: I1006 08:24:58.397776 4755 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9400682a-9230-42e8-95eb-2651c4ebdb5d","Type":"ContainerStarted","Data":"b297faaf9e2fc4f8f578b828bb4e862a7f5e06a38dc1fdbbdd3883fb938dc6c4"} Oct 06 08:24:58 crc kubenswrapper[4755]: I1006 08:24:58.914634 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:58 crc kubenswrapper[4755]: I1006 08:24:58.918912 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-zbxjs" Oct 06 08:24:58 crc kubenswrapper[4755]: I1006 08:24:58.938850 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.9388081980000003 podStartE2EDuration="3.938808198s" podCreationTimestamp="2025-10-06 08:24:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:58.417915403 +0000 UTC m=+155.247230637" watchObservedRunningTime="2025-10-06 08:24:58.938808198 +0000 UTC m=+155.768123412" Oct 06 08:24:59 crc kubenswrapper[4755]: I1006 08:24:59.470359 4755 generic.go:334] "Generic (PLEG): container finished" podID="9400682a-9230-42e8-95eb-2651c4ebdb5d" containerID="b297faaf9e2fc4f8f578b828bb4e862a7f5e06a38dc1fdbbdd3883fb938dc6c4" exitCode=0 Oct 06 08:24:59 crc kubenswrapper[4755]: I1006 08:24:59.470535 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9400682a-9230-42e8-95eb-2651c4ebdb5d","Type":"ContainerDied","Data":"b297faaf9e2fc4f8f578b828bb4e862a7f5e06a38dc1fdbbdd3883fb938dc6c4"} Oct 06 08:24:59 crc kubenswrapper[4755]: I1006 08:24:59.474056 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"a5479300-81a9-4688-93a4-6a7498b6223f","Type":"ContainerStarted","Data":"72abd0c5b20fcea64103188d9c20db88a1de6094e5a7ab0530521aae2db57077"} Oct 06 08:24:59 crc kubenswrapper[4755]: I1006 08:24:59.507484 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.507455126 podStartE2EDuration="2.507455126s" podCreationTimestamp="2025-10-06 08:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:24:59.506211705 +0000 UTC m=+156.335526939" watchObservedRunningTime="2025-10-06 08:24:59.507455126 +0000 UTC m=+156.336770340" Oct 06 08:25:00 crc kubenswrapper[4755]: I1006 08:25:00.517859 4755 generic.go:334] "Generic (PLEG): container finished" podID="a5479300-81a9-4688-93a4-6a7498b6223f" containerID="72abd0c5b20fcea64103188d9c20db88a1de6094e5a7ab0530521aae2db57077" exitCode=0 Oct 06 08:25:00 crc kubenswrapper[4755]: I1006 08:25:00.518536 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a5479300-81a9-4688-93a4-6a7498b6223f","Type":"ContainerDied","Data":"72abd0c5b20fcea64103188d9c20db88a1de6094e5a7ab0530521aae2db57077"} Oct 06 08:25:01 crc kubenswrapper[4755]: I1006 08:25:01.092408 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:25:01 crc kubenswrapper[4755]: I1006 08:25:01.284915 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9400682a-9230-42e8-95eb-2651c4ebdb5d-kubelet-dir\") pod \"9400682a-9230-42e8-95eb-2651c4ebdb5d\" (UID: \"9400682a-9230-42e8-95eb-2651c4ebdb5d\") " Oct 06 08:25:01 crc kubenswrapper[4755]: I1006 08:25:01.285011 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9400682a-9230-42e8-95eb-2651c4ebdb5d-kube-api-access\") pod \"9400682a-9230-42e8-95eb-2651c4ebdb5d\" (UID: \"9400682a-9230-42e8-95eb-2651c4ebdb5d\") " Oct 06 08:25:01 crc kubenswrapper[4755]: I1006 08:25:01.286090 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9400682a-9230-42e8-95eb-2651c4ebdb5d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9400682a-9230-42e8-95eb-2651c4ebdb5d" (UID: "9400682a-9230-42e8-95eb-2651c4ebdb5d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:25:01 crc kubenswrapper[4755]: I1006 08:25:01.314145 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9400682a-9230-42e8-95eb-2651c4ebdb5d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9400682a-9230-42e8-95eb-2651c4ebdb5d" (UID: "9400682a-9230-42e8-95eb-2651c4ebdb5d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:25:01 crc kubenswrapper[4755]: I1006 08:25:01.387741 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9400682a-9230-42e8-95eb-2651c4ebdb5d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:01 crc kubenswrapper[4755]: I1006 08:25:01.387795 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9400682a-9230-42e8-95eb-2651c4ebdb5d-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:01 crc kubenswrapper[4755]: I1006 08:25:01.533599 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:25:01 crc kubenswrapper[4755]: I1006 08:25:01.537450 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9400682a-9230-42e8-95eb-2651c4ebdb5d","Type":"ContainerDied","Data":"c3b679bbe10f113a6afddfbd3855f5c05b3cd137c4baaf9c9a6193ca9b350cc3"} Oct 06 08:25:01 crc kubenswrapper[4755]: I1006 08:25:01.537520 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3b679bbe10f113a6afddfbd3855f5c05b3cd137c4baaf9c9a6193ca9b350cc3" Oct 06 08:25:05 crc kubenswrapper[4755]: I1006 08:25:05.127910 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:25:05 crc kubenswrapper[4755]: I1006 08:25:05.133406 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:25:05 crc kubenswrapper[4755]: I1006 08:25:05.747643 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-klxzw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: 
connect: connection refused" start-of-body= Oct 06 08:25:05 crc kubenswrapper[4755]: I1006 08:25:05.748208 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klxzw" podUID="264bea46-510c-4a6c-ba59-91b0388882de" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 06 08:25:05 crc kubenswrapper[4755]: I1006 08:25:05.747640 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-klxzw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 06 08:25:05 crc kubenswrapper[4755]: I1006 08:25:05.748281 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-klxzw" podUID="264bea46-510c-4a6c-ba59-91b0388882de" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 06 08:25:06 crc kubenswrapper[4755]: I1006 08:25:06.472870 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:25:06 crc kubenswrapper[4755]: I1006 08:25:06.483270 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60fbd235-a60f-436e-9552-e3eaf60f24f3-metrics-certs\") pod \"network-metrics-daemon-vf9ht\" (UID: \"60fbd235-a60f-436e-9552-e3eaf60f24f3\") " pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:25:06 crc kubenswrapper[4755]: I1006 08:25:06.737519 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vf9ht" Oct 06 08:25:08 crc kubenswrapper[4755]: I1006 08:25:08.813401 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:25:08 crc kubenswrapper[4755]: I1006 08:25:08.909159 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5479300-81a9-4688-93a4-6a7498b6223f-kube-api-access\") pod \"a5479300-81a9-4688-93a4-6a7498b6223f\" (UID: \"a5479300-81a9-4688-93a4-6a7498b6223f\") " Oct 06 08:25:08 crc kubenswrapper[4755]: I1006 08:25:08.909221 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5479300-81a9-4688-93a4-6a7498b6223f-kubelet-dir\") pod \"a5479300-81a9-4688-93a4-6a7498b6223f\" (UID: \"a5479300-81a9-4688-93a4-6a7498b6223f\") " Oct 06 08:25:08 crc kubenswrapper[4755]: I1006 08:25:08.909391 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5479300-81a9-4688-93a4-6a7498b6223f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a5479300-81a9-4688-93a4-6a7498b6223f" (UID: "a5479300-81a9-4688-93a4-6a7498b6223f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:25:08 crc kubenswrapper[4755]: I1006 08:25:08.909537 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5479300-81a9-4688-93a4-6a7498b6223f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:08 crc kubenswrapper[4755]: I1006 08:25:08.914390 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5479300-81a9-4688-93a4-6a7498b6223f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a5479300-81a9-4688-93a4-6a7498b6223f" (UID: "a5479300-81a9-4688-93a4-6a7498b6223f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:25:09 crc kubenswrapper[4755]: I1006 08:25:09.011435 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5479300-81a9-4688-93a4-6a7498b6223f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:09 crc kubenswrapper[4755]: I1006 08:25:09.619306 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a5479300-81a9-4688-93a4-6a7498b6223f","Type":"ContainerDied","Data":"5d9427c54e2082e968f0c57109f196673dc41212d2c095759ff2d33cb8006401"} Oct 06 08:25:09 crc kubenswrapper[4755]: I1006 08:25:09.619879 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d9427c54e2082e968f0c57109f196673dc41212d2c095759ff2d33cb8006401" Oct 06 08:25:09 crc kubenswrapper[4755]: I1006 08:25:09.620100 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:25:14 crc kubenswrapper[4755]: I1006 08:25:14.710865 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:25:15 crc kubenswrapper[4755]: I1006 08:25:15.760050 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-klxzw" Oct 06 08:25:18 crc kubenswrapper[4755]: I1006 08:25:18.912049 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:25:18 crc kubenswrapper[4755]: I1006 08:25:18.912859 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:25:24 crc kubenswrapper[4755]: E1006 08:25:24.771396 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 06 08:25:24 crc kubenswrapper[4755]: E1006 08:25:24.773992 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lpvng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-wnmvf_openshift-marketplace(ccfb4e16-5c5f-4724-b694-02443086a6a1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:25:24 crc kubenswrapper[4755]: E1006 08:25:24.775484 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-wnmvf" podUID="ccfb4e16-5c5f-4724-b694-02443086a6a1" Oct 06 08:25:24 crc 
kubenswrapper[4755]: E1006 08:25:24.783266 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 06 08:25:24 crc kubenswrapper[4755]: E1006 08:25:24.783951 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2k6nm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-dzsrh_openshift-marketplace(b94e8d7e-d807-4809-ac0e-a219363e15d0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:25:24 crc kubenswrapper[4755]: E1006 08:25:24.785052 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dzsrh" podUID="b94e8d7e-d807-4809-ac0e-a219363e15d0" Oct 06 08:25:24 crc kubenswrapper[4755]: E1006 08:25:24.837790 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 06 08:25:24 crc kubenswrapper[4755]: E1006 08:25:24.838903 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5lvlw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-m2hkf_openshift-marketplace(560eb86d-1a29-4eaf-b992-8fa7df3d492c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:25:24 crc kubenswrapper[4755]: E1006 08:25:24.840130 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-m2hkf" podUID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" Oct 06 08:25:25 crc 
kubenswrapper[4755]: I1006 08:25:25.112347 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vf9ht"] Oct 06 08:25:25 crc kubenswrapper[4755]: E1006 08:25:25.640668 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 08:25:25 crc kubenswrapper[4755]: E1006 08:25:25.640924 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5tffb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogs
OnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-v6ww9_openshift-marketplace(3a3d30f0-54da-4d3e-add8-faa0c3eeea1a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:25:25 crc kubenswrapper[4755]: E1006 08:25:25.642453 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-v6ww9" podUID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" Oct 06 08:25:25 crc kubenswrapper[4755]: E1006 08:25:25.649732 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 08:25:25 crc kubenswrapper[4755]: E1006 08:25:25.650238 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jlrhm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-w7dnk_openshift-marketplace(62ecb1af-d269-4f5a-84ec-026b74882414): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:25:25 crc kubenswrapper[4755]: E1006 08:25:25.652180 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-w7dnk" podUID="62ecb1af-d269-4f5a-84ec-026b74882414" Oct 06 08:25:25 crc 
kubenswrapper[4755]: I1006 08:25:25.721098 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" event={"ID":"60fbd235-a60f-436e-9552-e3eaf60f24f3","Type":"ContainerStarted","Data":"b4f44ac9e13421f6550d8ef3dc3b56c498967ec1559fe2c1296e5cab03497f71"} Oct 06 08:25:25 crc kubenswrapper[4755]: E1006 08:25:25.724876 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-m2hkf" podUID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" Oct 06 08:25:25 crc kubenswrapper[4755]: E1006 08:25:25.724893 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-wnmvf" podUID="ccfb4e16-5c5f-4724-b694-02443086a6a1" Oct 06 08:25:25 crc kubenswrapper[4755]: E1006 08:25:25.724993 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-v6ww9" podUID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" Oct 06 08:25:25 crc kubenswrapper[4755]: E1006 08:25:25.725062 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-w7dnk" podUID="62ecb1af-d269-4f5a-84ec-026b74882414" Oct 06 08:25:25 crc kubenswrapper[4755]: E1006 08:25:25.725165 4755 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dzsrh" podUID="b94e8d7e-d807-4809-ac0e-a219363e15d0" Oct 06 08:25:25 crc kubenswrapper[4755]: I1006 08:25:25.910830 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lzq7b" Oct 06 08:25:26 crc kubenswrapper[4755]: I1006 08:25:26.735777 4755 generic.go:334] "Generic (PLEG): container finished" podID="6fc654c8-49a2-4815-b0c6-edfc8ac3d836" containerID="54bf5fb45f182ae8ff9e7e41c8e76bc7b611364c6095abe721e174e822b33e17" exitCode=0 Oct 06 08:25:26 crc kubenswrapper[4755]: I1006 08:25:26.736017 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zm7s" event={"ID":"6fc654c8-49a2-4815-b0c6-edfc8ac3d836","Type":"ContainerDied","Data":"54bf5fb45f182ae8ff9e7e41c8e76bc7b611364c6095abe721e174e822b33e17"} Oct 06 08:25:26 crc kubenswrapper[4755]: I1006 08:25:26.741189 4755 generic.go:334] "Generic (PLEG): container finished" podID="79294028-a667-4a44-bf46-a7597f221243" containerID="655ee3e55a58328473728cfaf241409bc8eaf344a50e401897ed4dce6d31817e" exitCode=0 Oct 06 08:25:26 crc kubenswrapper[4755]: I1006 08:25:26.741312 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jnxh" event={"ID":"79294028-a667-4a44-bf46-a7597f221243","Type":"ContainerDied","Data":"655ee3e55a58328473728cfaf241409bc8eaf344a50e401897ed4dce6d31817e"} Oct 06 08:25:26 crc kubenswrapper[4755]: I1006 08:25:26.750687 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" event={"ID":"60fbd235-a60f-436e-9552-e3eaf60f24f3","Type":"ContainerStarted","Data":"354fa86a1ffaf1d724d0cef13c8d8c723425eba866fe376e49b3127376488608"} Oct 06 08:25:26 crc 
kubenswrapper[4755]: I1006 08:25:26.750760 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vf9ht" event={"ID":"60fbd235-a60f-436e-9552-e3eaf60f24f3","Type":"ContainerStarted","Data":"833b709f39022871c8858f8475d42da6e2848457f4d948232715e753b313a82d"} Oct 06 08:25:26 crc kubenswrapper[4755]: I1006 08:25:26.769216 4755 generic.go:334] "Generic (PLEG): container finished" podID="5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" containerID="5541d365aca46b4848cc213a28313e598961e590850f3fec680127c9b5aec830" exitCode=0 Oct 06 08:25:26 crc kubenswrapper[4755]: I1006 08:25:26.769279 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6v7p8" event={"ID":"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1","Type":"ContainerDied","Data":"5541d365aca46b4848cc213a28313e598961e590850f3fec680127c9b5aec830"} Oct 06 08:25:26 crc kubenswrapper[4755]: I1006 08:25:26.810394 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vf9ht" podStartSLOduration=163.810357365 podStartE2EDuration="2m43.810357365s" podCreationTimestamp="2025-10-06 08:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:25:26.798292225 +0000 UTC m=+183.627607449" watchObservedRunningTime="2025-10-06 08:25:26.810357365 +0000 UTC m=+183.639672589" Oct 06 08:25:29 crc kubenswrapper[4755]: I1006 08:25:29.790270 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zm7s" event={"ID":"6fc654c8-49a2-4815-b0c6-edfc8ac3d836","Type":"ContainerStarted","Data":"6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84"} Oct 06 08:25:29 crc kubenswrapper[4755]: I1006 08:25:29.794528 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jnxh" 
event={"ID":"79294028-a667-4a44-bf46-a7597f221243","Type":"ContainerStarted","Data":"bda4b341445a062be65632c1476498e6d05c14c521829aef0e26c2a6a0fd9ae5"} Oct 06 08:25:29 crc kubenswrapper[4755]: I1006 08:25:29.798162 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6v7p8" event={"ID":"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1","Type":"ContainerStarted","Data":"2f05b4a4c995a9cbee7236edb8c84b87087efbdb58e7948a7f4d98cfd2c2e2b2"} Oct 06 08:25:29 crc kubenswrapper[4755]: I1006 08:25:29.818441 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4zm7s" podStartSLOduration=2.824796641 podStartE2EDuration="36.818420675s" podCreationTimestamp="2025-10-06 08:24:53 +0000 UTC" firstStartedPulling="2025-10-06 08:24:55.278870747 +0000 UTC m=+152.108185961" lastFinishedPulling="2025-10-06 08:25:29.272494781 +0000 UTC m=+186.101809995" observedRunningTime="2025-10-06 08:25:29.815984355 +0000 UTC m=+186.645299579" watchObservedRunningTime="2025-10-06 08:25:29.818420675 +0000 UTC m=+186.647735889" Oct 06 08:25:29 crc kubenswrapper[4755]: I1006 08:25:29.837861 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9jnxh" podStartSLOduration=5.447982136 podStartE2EDuration="39.837829436s" podCreationTimestamp="2025-10-06 08:24:50 +0000 UTC" firstStartedPulling="2025-10-06 08:24:54.162608063 +0000 UTC m=+150.991923277" lastFinishedPulling="2025-10-06 08:25:28.552455353 +0000 UTC m=+185.381770577" observedRunningTime="2025-10-06 08:25:29.835901048 +0000 UTC m=+186.665216262" watchObservedRunningTime="2025-10-06 08:25:29.837829436 +0000 UTC m=+186.667144670" Oct 06 08:25:29 crc kubenswrapper[4755]: I1006 08:25:29.875619 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6v7p8" podStartSLOduration=1.9032605070000002 podStartE2EDuration="35.875597922s" 
podCreationTimestamp="2025-10-06 08:24:54 +0000 UTC" firstStartedPulling="2025-10-06 08:24:55.32819495 +0000 UTC m=+152.157510164" lastFinishedPulling="2025-10-06 08:25:29.300532365 +0000 UTC m=+186.129847579" observedRunningTime="2025-10-06 08:25:29.870858474 +0000 UTC m=+186.700173708" watchObservedRunningTime="2025-10-06 08:25:29.875597922 +0000 UTC m=+186.704913136" Oct 06 08:25:31 crc kubenswrapper[4755]: I1006 08:25:31.100834 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9jnxh" Oct 06 08:25:31 crc kubenswrapper[4755]: I1006 08:25:31.101262 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9jnxh" Oct 06 08:25:31 crc kubenswrapper[4755]: I1006 08:25:31.129411 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:25:31 crc kubenswrapper[4755]: I1006 08:25:31.282102 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9jnxh" Oct 06 08:25:34 crc kubenswrapper[4755]: I1006 08:25:34.384844 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:25:34 crc kubenswrapper[4755]: I1006 08:25:34.385647 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:25:34 crc kubenswrapper[4755]: I1006 08:25:34.463113 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:25:34 crc kubenswrapper[4755]: I1006 08:25:34.669628 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:25:34 crc kubenswrapper[4755]: I1006 08:25:34.669688 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:25:34 crc kubenswrapper[4755]: I1006 08:25:34.754346 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:25:34 crc kubenswrapper[4755]: I1006 08:25:34.874494 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:25:34 crc kubenswrapper[4755]: I1006 08:25:34.876933 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:25:35 crc kubenswrapper[4755]: I1006 08:25:35.952917 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6v7p8"] Oct 06 08:25:36 crc kubenswrapper[4755]: I1006 08:25:36.843689 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6v7p8" podUID="5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" containerName="registry-server" containerID="cri-o://2f05b4a4c995a9cbee7236edb8c84b87087efbdb58e7948a7f4d98cfd2c2e2b2" gracePeriod=2 Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.640667 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.799964 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xfqp\" (UniqueName: \"kubernetes.io/projected/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-kube-api-access-4xfqp\") pod \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\" (UID: \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\") " Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.800117 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-catalog-content\") pod \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\" (UID: \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\") " Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.800209 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-utilities\") pod \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\" (UID: \"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1\") " Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.802097 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-utilities" (OuterVolumeSpecName: "utilities") pod "5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" (UID: "5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.806687 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-kube-api-access-4xfqp" (OuterVolumeSpecName: "kube-api-access-4xfqp") pod "5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" (UID: "5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1"). InnerVolumeSpecName "kube-api-access-4xfqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.859395 4755 generic.go:334] "Generic (PLEG): container finished" podID="b94e8d7e-d807-4809-ac0e-a219363e15d0" containerID="5014417e0d497baa07c6659a1af1a8761ff2209cdf8b8c910fb52de22ba93a62" exitCode=0 Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.859537 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dzsrh" event={"ID":"b94e8d7e-d807-4809-ac0e-a219363e15d0","Type":"ContainerDied","Data":"5014417e0d497baa07c6659a1af1a8761ff2209cdf8b8c910fb52de22ba93a62"} Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.865139 4755 generic.go:334] "Generic (PLEG): container finished" podID="5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" containerID="2f05b4a4c995a9cbee7236edb8c84b87087efbdb58e7948a7f4d98cfd2c2e2b2" exitCode=0 Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.865200 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6v7p8" event={"ID":"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1","Type":"ContainerDied","Data":"2f05b4a4c995a9cbee7236edb8c84b87087efbdb58e7948a7f4d98cfd2c2e2b2"} Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.865241 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6v7p8" event={"ID":"5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1","Type":"ContainerDied","Data":"02cbb7db5386fc4c067277529f8328d532b096322e4630fd5e2885a0477fe143"} Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.865252 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6v7p8" Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.865271 4755 scope.go:117] "RemoveContainer" containerID="2f05b4a4c995a9cbee7236edb8c84b87087efbdb58e7948a7f4d98cfd2c2e2b2" Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.901721 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xfqp\" (UniqueName: \"kubernetes.io/projected/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-kube-api-access-4xfqp\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.901758 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.910996 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" (UID: "5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.911199 4755 scope.go:117] "RemoveContainer" containerID="5541d365aca46b4848cc213a28313e598961e590850f3fec680127c9b5aec830" Oct 06 08:25:37 crc kubenswrapper[4755]: I1006 08:25:37.945806 4755 scope.go:117] "RemoveContainer" containerID="160a2846e79ce0a1703b0040452e71b6addb020c42a1880d6c73c47a7f1b2d53" Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.003546 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.007909 4755 scope.go:117] "RemoveContainer" containerID="2f05b4a4c995a9cbee7236edb8c84b87087efbdb58e7948a7f4d98cfd2c2e2b2" Oct 06 08:25:38 crc kubenswrapper[4755]: E1006 08:25:38.008525 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f05b4a4c995a9cbee7236edb8c84b87087efbdb58e7948a7f4d98cfd2c2e2b2\": container with ID starting with 2f05b4a4c995a9cbee7236edb8c84b87087efbdb58e7948a7f4d98cfd2c2e2b2 not found: ID does not exist" containerID="2f05b4a4c995a9cbee7236edb8c84b87087efbdb58e7948a7f4d98cfd2c2e2b2" Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.008593 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f05b4a4c995a9cbee7236edb8c84b87087efbdb58e7948a7f4d98cfd2c2e2b2"} err="failed to get container status \"2f05b4a4c995a9cbee7236edb8c84b87087efbdb58e7948a7f4d98cfd2c2e2b2\": rpc error: code = NotFound desc = could not find container \"2f05b4a4c995a9cbee7236edb8c84b87087efbdb58e7948a7f4d98cfd2c2e2b2\": container with ID starting with 2f05b4a4c995a9cbee7236edb8c84b87087efbdb58e7948a7f4d98cfd2c2e2b2 not found: ID does not exist" Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.008659 4755 
scope.go:117] "RemoveContainer" containerID="5541d365aca46b4848cc213a28313e598961e590850f3fec680127c9b5aec830" Oct 06 08:25:38 crc kubenswrapper[4755]: E1006 08:25:38.009268 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5541d365aca46b4848cc213a28313e598961e590850f3fec680127c9b5aec830\": container with ID starting with 5541d365aca46b4848cc213a28313e598961e590850f3fec680127c9b5aec830 not found: ID does not exist" containerID="5541d365aca46b4848cc213a28313e598961e590850f3fec680127c9b5aec830" Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.009318 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5541d365aca46b4848cc213a28313e598961e590850f3fec680127c9b5aec830"} err="failed to get container status \"5541d365aca46b4848cc213a28313e598961e590850f3fec680127c9b5aec830\": rpc error: code = NotFound desc = could not find container \"5541d365aca46b4848cc213a28313e598961e590850f3fec680127c9b5aec830\": container with ID starting with 5541d365aca46b4848cc213a28313e598961e590850f3fec680127c9b5aec830 not found: ID does not exist" Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.009361 4755 scope.go:117] "RemoveContainer" containerID="160a2846e79ce0a1703b0040452e71b6addb020c42a1880d6c73c47a7f1b2d53" Oct 06 08:25:38 crc kubenswrapper[4755]: E1006 08:25:38.009982 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"160a2846e79ce0a1703b0040452e71b6addb020c42a1880d6c73c47a7f1b2d53\": container with ID starting with 160a2846e79ce0a1703b0040452e71b6addb020c42a1880d6c73c47a7f1b2d53 not found: ID does not exist" containerID="160a2846e79ce0a1703b0040452e71b6addb020c42a1880d6c73c47a7f1b2d53" Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.010015 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"160a2846e79ce0a1703b0040452e71b6addb020c42a1880d6c73c47a7f1b2d53"} err="failed to get container status \"160a2846e79ce0a1703b0040452e71b6addb020c42a1880d6c73c47a7f1b2d53\": rpc error: code = NotFound desc = could not find container \"160a2846e79ce0a1703b0040452e71b6addb020c42a1880d6c73c47a7f1b2d53\": container with ID starting with 160a2846e79ce0a1703b0040452e71b6addb020c42a1880d6c73c47a7f1b2d53 not found: ID does not exist" Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.202546 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6v7p8"] Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.210312 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6v7p8"] Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.874658 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2hkf" event={"ID":"560eb86d-1a29-4eaf-b992-8fa7df3d492c","Type":"ContainerStarted","Data":"b9a7605b1b695c567cf8c50b8cb35459009db3ba302dffb62defe44ce90fbf2c"} Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.876787 4755 generic.go:334] "Generic (PLEG): container finished" podID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" containerID="97e6e2a919da2ef8d99b4a30c7cdf961948c9be0ae0bc0a41da036f6e4e5429f" exitCode=0 Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.876871 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6ww9" event={"ID":"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a","Type":"ContainerDied","Data":"97e6e2a919da2ef8d99b4a30c7cdf961948c9be0ae0bc0a41da036f6e4e5429f"} Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.881517 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dzsrh" 
event={"ID":"b94e8d7e-d807-4809-ac0e-a219363e15d0","Type":"ContainerStarted","Data":"6e63a77e7e70b06e64338ea7e08c16490c03ee4ac8de49af6c75d85a37d02614"} Oct 06 08:25:38 crc kubenswrapper[4755]: I1006 08:25:38.943690 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dzsrh" podStartSLOduration=4.882188369 podStartE2EDuration="48.943670362s" podCreationTimestamp="2025-10-06 08:24:50 +0000 UTC" firstStartedPulling="2025-10-06 08:24:54.211880884 +0000 UTC m=+151.041196098" lastFinishedPulling="2025-10-06 08:25:38.273362877 +0000 UTC m=+195.102678091" observedRunningTime="2025-10-06 08:25:38.939657484 +0000 UTC m=+195.768972698" watchObservedRunningTime="2025-10-06 08:25:38.943670362 +0000 UTC m=+195.772985576" Oct 06 08:25:39 crc kubenswrapper[4755]: I1006 08:25:39.886591 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" path="/var/lib/kubelet/pods/5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1/volumes" Oct 06 08:25:39 crc kubenswrapper[4755]: I1006 08:25:39.892994 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6ww9" event={"ID":"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a","Type":"ContainerStarted","Data":"aa05ac41af82346e6bc778d2f705b0ffae503a05a3ae2c46361a6ea96832e3b7"} Oct 06 08:25:39 crc kubenswrapper[4755]: I1006 08:25:39.898119 4755 generic.go:334] "Generic (PLEG): container finished" podID="62ecb1af-d269-4f5a-84ec-026b74882414" containerID="9dc8ffdcf6fd145a7b488db8bf4b864ab1e4938b6e50e420c9e883c3fc7a9753" exitCode=0 Oct 06 08:25:39 crc kubenswrapper[4755]: I1006 08:25:39.898217 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7dnk" event={"ID":"62ecb1af-d269-4f5a-84ec-026b74882414","Type":"ContainerDied","Data":"9dc8ffdcf6fd145a7b488db8bf4b864ab1e4938b6e50e420c9e883c3fc7a9753"} Oct 06 08:25:39 crc kubenswrapper[4755]: I1006 08:25:39.900926 
4755 generic.go:334] "Generic (PLEG): container finished" podID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" containerID="b9a7605b1b695c567cf8c50b8cb35459009db3ba302dffb62defe44ce90fbf2c" exitCode=0 Oct 06 08:25:39 crc kubenswrapper[4755]: I1006 08:25:39.900965 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2hkf" event={"ID":"560eb86d-1a29-4eaf-b992-8fa7df3d492c","Type":"ContainerDied","Data":"b9a7605b1b695c567cf8c50b8cb35459009db3ba302dffb62defe44ce90fbf2c"} Oct 06 08:25:39 crc kubenswrapper[4755]: I1006 08:25:39.953016 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v6ww9" podStartSLOduration=2.967180649 podStartE2EDuration="46.952991747s" podCreationTimestamp="2025-10-06 08:24:53 +0000 UTC" firstStartedPulling="2025-10-06 08:24:55.286807784 +0000 UTC m=+152.116122998" lastFinishedPulling="2025-10-06 08:25:39.272618882 +0000 UTC m=+196.101934096" observedRunningTime="2025-10-06 08:25:39.948823134 +0000 UTC m=+196.778138348" watchObservedRunningTime="2025-10-06 08:25:39.952991747 +0000 UTC m=+196.782306961" Oct 06 08:25:40 crc kubenswrapper[4755]: I1006 08:25:40.910227 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2hkf" event={"ID":"560eb86d-1a29-4eaf-b992-8fa7df3d492c","Type":"ContainerStarted","Data":"6dc38518bc8d6e97bb91711e53f8735a78ec552778c6e83d690d1f0333b0ac98"} Oct 06 08:25:40 crc kubenswrapper[4755]: I1006 08:25:40.912149 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7dnk" event={"ID":"62ecb1af-d269-4f5a-84ec-026b74882414","Type":"ContainerStarted","Data":"f4cba2cf17bdd5eae1d684ae466a5ce7269c0d949aa0689e3a185af8a3e6b3f4"} Oct 06 08:25:40 crc kubenswrapper[4755]: I1006 08:25:40.913778 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnmvf" 
event={"ID":"ccfb4e16-5c5f-4724-b694-02443086a6a1","Type":"ContainerStarted","Data":"05749b8bcc078863564e2c7a165ac104cc78f7938562913b7e07e151e5837f8d"} Oct 06 08:25:40 crc kubenswrapper[4755]: I1006 08:25:40.932709 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m2hkf" podStartSLOduration=3.709226256 podStartE2EDuration="49.932687642s" podCreationTimestamp="2025-10-06 08:24:51 +0000 UTC" firstStartedPulling="2025-10-06 08:24:54.149234062 +0000 UTC m=+150.978549276" lastFinishedPulling="2025-10-06 08:25:40.372695448 +0000 UTC m=+197.202010662" observedRunningTime="2025-10-06 08:25:40.93106129 +0000 UTC m=+197.760376494" watchObservedRunningTime="2025-10-06 08:25:40.932687642 +0000 UTC m=+197.762002856" Oct 06 08:25:40 crc kubenswrapper[4755]: I1006 08:25:40.953744 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w7dnk" podStartSLOduration=2.7633974869999998 podStartE2EDuration="48.953724058s" podCreationTimestamp="2025-10-06 08:24:52 +0000 UTC" firstStartedPulling="2025-10-06 08:24:54.25130916 +0000 UTC m=+151.080624374" lastFinishedPulling="2025-10-06 08:25:40.441635731 +0000 UTC m=+197.270950945" observedRunningTime="2025-10-06 08:25:40.949722165 +0000 UTC m=+197.779037379" watchObservedRunningTime="2025-10-06 08:25:40.953724058 +0000 UTC m=+197.783039272" Oct 06 08:25:41 crc kubenswrapper[4755]: I1006 08:25:41.146148 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9jnxh" Oct 06 08:25:41 crc kubenswrapper[4755]: I1006 08:25:41.366601 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dzsrh" Oct 06 08:25:41 crc kubenswrapper[4755]: I1006 08:25:41.366663 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dzsrh" Oct 06 
08:25:41 crc kubenswrapper[4755]: I1006 08:25:41.408073 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dzsrh" Oct 06 08:25:41 crc kubenswrapper[4755]: I1006 08:25:41.547055 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:25:41 crc kubenswrapper[4755]: I1006 08:25:41.547114 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:25:41 crc kubenswrapper[4755]: I1006 08:25:41.921309 4755 generic.go:334] "Generic (PLEG): container finished" podID="ccfb4e16-5c5f-4724-b694-02443086a6a1" containerID="05749b8bcc078863564e2c7a165ac104cc78f7938562913b7e07e151e5837f8d" exitCode=0 Oct 06 08:25:41 crc kubenswrapper[4755]: I1006 08:25:41.921518 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnmvf" event={"ID":"ccfb4e16-5c5f-4724-b694-02443086a6a1","Type":"ContainerDied","Data":"05749b8bcc078863564e2c7a165ac104cc78f7938562913b7e07e151e5837f8d"} Oct 06 08:25:42 crc kubenswrapper[4755]: I1006 08:25:42.585896 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-m2hkf" podUID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" containerName="registry-server" probeResult="failure" output=< Oct 06 08:25:42 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Oct 06 08:25:42 crc kubenswrapper[4755]: > Oct 06 08:25:42 crc kubenswrapper[4755]: I1006 08:25:42.939322 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnmvf" event={"ID":"ccfb4e16-5c5f-4724-b694-02443086a6a1","Type":"ContainerStarted","Data":"ea880bb2129b065fda266d38d29afc83dd92b26302da6c15f19b8e246eaecd88"} Oct 06 08:25:42 crc kubenswrapper[4755]: I1006 08:25:42.960732 4755 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/community-operators-wnmvf" podStartSLOduration=3.794436723 podStartE2EDuration="51.960701579s" podCreationTimestamp="2025-10-06 08:24:51 +0000 UTC" firstStartedPulling="2025-10-06 08:24:54.259321089 +0000 UTC m=+151.088636303" lastFinishedPulling="2025-10-06 08:25:42.425585945 +0000 UTC m=+199.254901159" observedRunningTime="2025-10-06 08:25:42.960453917 +0000 UTC m=+199.789769161" watchObservedRunningTime="2025-10-06 08:25:42.960701579 +0000 UTC m=+199.790016793" Oct 06 08:25:43 crc kubenswrapper[4755]: I1006 08:25:43.292023 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:25:43 crc kubenswrapper[4755]: I1006 08:25:43.292114 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:25:43 crc kubenswrapper[4755]: I1006 08:25:43.341671 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:25:43 crc kubenswrapper[4755]: I1006 08:25:43.581974 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:25:43 crc kubenswrapper[4755]: I1006 08:25:43.582055 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:25:43 crc kubenswrapper[4755]: I1006 08:25:43.622421 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:25:44 crc kubenswrapper[4755]: I1006 08:25:44.309883 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p47k9"] Oct 06 08:25:48 crc kubenswrapper[4755]: I1006 08:25:48.912368 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:25:48 crc kubenswrapper[4755]: I1006 08:25:48.913044 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:25:48 crc kubenswrapper[4755]: I1006 08:25:48.913108 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:25:48 crc kubenswrapper[4755]: I1006 08:25:48.913917 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:25:48 crc kubenswrapper[4755]: I1006 08:25:48.914071 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2" gracePeriod=600 Oct 06 08:25:49 crc kubenswrapper[4755]: I1006 08:25:49.991473 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2" exitCode=0 Oct 06 08:25:49 crc kubenswrapper[4755]: I1006 08:25:49.991582 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2"} Oct 06 08:25:49 crc kubenswrapper[4755]: I1006 08:25:49.991937 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"fec0c4eb81f7712bab171b121e51397b6025d5a32e7a8d750be5c472df105d18"} Oct 06 08:25:51 crc kubenswrapper[4755]: I1006 08:25:51.410002 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dzsrh" Oct 06 08:25:51 crc kubenswrapper[4755]: I1006 08:25:51.584998 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:25:51 crc kubenswrapper[4755]: I1006 08:25:51.624757 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:25:51 crc kubenswrapper[4755]: I1006 08:25:51.657662 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:25:51 crc kubenswrapper[4755]: I1006 08:25:51.657734 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:25:51 crc kubenswrapper[4755]: I1006 08:25:51.697956 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:25:52 crc kubenswrapper[4755]: I1006 08:25:52.043496 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:25:52 crc kubenswrapper[4755]: I1006 08:25:52.553764 4755 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-wnmvf"] Oct 06 08:25:53 crc kubenswrapper[4755]: I1006 08:25:53.339305 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:25:53 crc kubenswrapper[4755]: I1006 08:25:53.640206 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:25:53 crc kubenswrapper[4755]: I1006 08:25:53.951599 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m2hkf"] Oct 06 08:25:53 crc kubenswrapper[4755]: I1006 08:25:53.951866 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m2hkf" podUID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" containerName="registry-server" containerID="cri-o://6dc38518bc8d6e97bb91711e53f8735a78ec552778c6e83d690d1f0333b0ac98" gracePeriod=2 Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.013543 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wnmvf" podUID="ccfb4e16-5c5f-4724-b694-02443086a6a1" containerName="registry-server" containerID="cri-o://ea880bb2129b065fda266d38d29afc83dd92b26302da6c15f19b8e246eaecd88" gracePeriod=2 Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.298549 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.407261 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.461005 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lvlw\" (UniqueName: \"kubernetes.io/projected/560eb86d-1a29-4eaf-b992-8fa7df3d492c-kube-api-access-5lvlw\") pod \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\" (UID: \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\") " Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.461082 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560eb86d-1a29-4eaf-b992-8fa7df3d492c-catalog-content\") pod \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\" (UID: \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\") " Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.461207 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560eb86d-1a29-4eaf-b992-8fa7df3d492c-utilities\") pod \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\" (UID: \"560eb86d-1a29-4eaf-b992-8fa7df3d492c\") " Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.462681 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/560eb86d-1a29-4eaf-b992-8fa7df3d492c-utilities" (OuterVolumeSpecName: "utilities") pod "560eb86d-1a29-4eaf-b992-8fa7df3d492c" (UID: "560eb86d-1a29-4eaf-b992-8fa7df3d492c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.468553 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/560eb86d-1a29-4eaf-b992-8fa7df3d492c-kube-api-access-5lvlw" (OuterVolumeSpecName: "kube-api-access-5lvlw") pod "560eb86d-1a29-4eaf-b992-8fa7df3d492c" (UID: "560eb86d-1a29-4eaf-b992-8fa7df3d492c"). InnerVolumeSpecName "kube-api-access-5lvlw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.507780 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/560eb86d-1a29-4eaf-b992-8fa7df3d492c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "560eb86d-1a29-4eaf-b992-8fa7df3d492c" (UID: "560eb86d-1a29-4eaf-b992-8fa7df3d492c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.562817 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccfb4e16-5c5f-4724-b694-02443086a6a1-utilities\") pod \"ccfb4e16-5c5f-4724-b694-02443086a6a1\" (UID: \"ccfb4e16-5c5f-4724-b694-02443086a6a1\") " Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.562932 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpvng\" (UniqueName: \"kubernetes.io/projected/ccfb4e16-5c5f-4724-b694-02443086a6a1-kube-api-access-lpvng\") pod \"ccfb4e16-5c5f-4724-b694-02443086a6a1\" (UID: \"ccfb4e16-5c5f-4724-b694-02443086a6a1\") " Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.562975 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccfb4e16-5c5f-4724-b694-02443086a6a1-catalog-content\") pod \"ccfb4e16-5c5f-4724-b694-02443086a6a1\" (UID: \"ccfb4e16-5c5f-4724-b694-02443086a6a1\") " Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.563351 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560eb86d-1a29-4eaf-b992-8fa7df3d492c-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.563373 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lvlw\" (UniqueName: 
\"kubernetes.io/projected/560eb86d-1a29-4eaf-b992-8fa7df3d492c-kube-api-access-5lvlw\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.563382 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560eb86d-1a29-4eaf-b992-8fa7df3d492c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.564126 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccfb4e16-5c5f-4724-b694-02443086a6a1-utilities" (OuterVolumeSpecName: "utilities") pod "ccfb4e16-5c5f-4724-b694-02443086a6a1" (UID: "ccfb4e16-5c5f-4724-b694-02443086a6a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.565994 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccfb4e16-5c5f-4724-b694-02443086a6a1-kube-api-access-lpvng" (OuterVolumeSpecName: "kube-api-access-lpvng") pod "ccfb4e16-5c5f-4724-b694-02443086a6a1" (UID: "ccfb4e16-5c5f-4724-b694-02443086a6a1"). InnerVolumeSpecName "kube-api-access-lpvng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.609755 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccfb4e16-5c5f-4724-b694-02443086a6a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ccfb4e16-5c5f-4724-b694-02443086a6a1" (UID: "ccfb4e16-5c5f-4724-b694-02443086a6a1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.664179 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccfb4e16-5c5f-4724-b694-02443086a6a1-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.664204 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpvng\" (UniqueName: \"kubernetes.io/projected/ccfb4e16-5c5f-4724-b694-02443086a6a1-kube-api-access-lpvng\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:54 crc kubenswrapper[4755]: I1006 08:25:54.664213 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccfb4e16-5c5f-4724-b694-02443086a6a1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.021967 4755 generic.go:334] "Generic (PLEG): container finished" podID="ccfb4e16-5c5f-4724-b694-02443086a6a1" containerID="ea880bb2129b065fda266d38d29afc83dd92b26302da6c15f19b8e246eaecd88" exitCode=0 Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.022026 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnmvf" event={"ID":"ccfb4e16-5c5f-4724-b694-02443086a6a1","Type":"ContainerDied","Data":"ea880bb2129b065fda266d38d29afc83dd92b26302da6c15f19b8e246eaecd88"} Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.022072 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnmvf" event={"ID":"ccfb4e16-5c5f-4724-b694-02443086a6a1","Type":"ContainerDied","Data":"5926c3cc4191dc47066b0e4f528cfe7a6c04982a628aed3f3cd79e0541faefb7"} Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.022070 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wnmvf" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.022121 4755 scope.go:117] "RemoveContainer" containerID="ea880bb2129b065fda266d38d29afc83dd92b26302da6c15f19b8e246eaecd88" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.025218 4755 generic.go:334] "Generic (PLEG): container finished" podID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" containerID="6dc38518bc8d6e97bb91711e53f8735a78ec552778c6e83d690d1f0333b0ac98" exitCode=0 Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.025278 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2hkf" event={"ID":"560eb86d-1a29-4eaf-b992-8fa7df3d492c","Type":"ContainerDied","Data":"6dc38518bc8d6e97bb91711e53f8735a78ec552778c6e83d690d1f0333b0ac98"} Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.025320 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2hkf" event={"ID":"560eb86d-1a29-4eaf-b992-8fa7df3d492c","Type":"ContainerDied","Data":"25225c801858c982c86ed7b342bfd23f081345d3966b58da32382fc703e0f581"} Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.025342 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m2hkf" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.041366 4755 scope.go:117] "RemoveContainer" containerID="05749b8bcc078863564e2c7a165ac104cc78f7938562913b7e07e151e5837f8d" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.059339 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wnmvf"] Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.066763 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wnmvf"] Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.071413 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m2hkf"] Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.074941 4755 scope.go:117] "RemoveContainer" containerID="c0ef87dfc5ec7eb4f21bf8007ee6dc5088351acbb8c45efd2f3e11c4e7dc45db" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.076765 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m2hkf"] Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.096489 4755 scope.go:117] "RemoveContainer" containerID="ea880bb2129b065fda266d38d29afc83dd92b26302da6c15f19b8e246eaecd88" Oct 06 08:25:55 crc kubenswrapper[4755]: E1006 08:25:55.097082 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea880bb2129b065fda266d38d29afc83dd92b26302da6c15f19b8e246eaecd88\": container with ID starting with ea880bb2129b065fda266d38d29afc83dd92b26302da6c15f19b8e246eaecd88 not found: ID does not exist" containerID="ea880bb2129b065fda266d38d29afc83dd92b26302da6c15f19b8e246eaecd88" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.097119 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ea880bb2129b065fda266d38d29afc83dd92b26302da6c15f19b8e246eaecd88"} err="failed to get container status \"ea880bb2129b065fda266d38d29afc83dd92b26302da6c15f19b8e246eaecd88\": rpc error: code = NotFound desc = could not find container \"ea880bb2129b065fda266d38d29afc83dd92b26302da6c15f19b8e246eaecd88\": container with ID starting with ea880bb2129b065fda266d38d29afc83dd92b26302da6c15f19b8e246eaecd88 not found: ID does not exist" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.097148 4755 scope.go:117] "RemoveContainer" containerID="05749b8bcc078863564e2c7a165ac104cc78f7938562913b7e07e151e5837f8d" Oct 06 08:25:55 crc kubenswrapper[4755]: E1006 08:25:55.097410 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05749b8bcc078863564e2c7a165ac104cc78f7938562913b7e07e151e5837f8d\": container with ID starting with 05749b8bcc078863564e2c7a165ac104cc78f7938562913b7e07e151e5837f8d not found: ID does not exist" containerID="05749b8bcc078863564e2c7a165ac104cc78f7938562913b7e07e151e5837f8d" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.097435 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05749b8bcc078863564e2c7a165ac104cc78f7938562913b7e07e151e5837f8d"} err="failed to get container status \"05749b8bcc078863564e2c7a165ac104cc78f7938562913b7e07e151e5837f8d\": rpc error: code = NotFound desc = could not find container \"05749b8bcc078863564e2c7a165ac104cc78f7938562913b7e07e151e5837f8d\": container with ID starting with 05749b8bcc078863564e2c7a165ac104cc78f7938562913b7e07e151e5837f8d not found: ID does not exist" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.097449 4755 scope.go:117] "RemoveContainer" containerID="c0ef87dfc5ec7eb4f21bf8007ee6dc5088351acbb8c45efd2f3e11c4e7dc45db" Oct 06 08:25:55 crc kubenswrapper[4755]: E1006 08:25:55.097832 4755 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c0ef87dfc5ec7eb4f21bf8007ee6dc5088351acbb8c45efd2f3e11c4e7dc45db\": container with ID starting with c0ef87dfc5ec7eb4f21bf8007ee6dc5088351acbb8c45efd2f3e11c4e7dc45db not found: ID does not exist" containerID="c0ef87dfc5ec7eb4f21bf8007ee6dc5088351acbb8c45efd2f3e11c4e7dc45db" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.097855 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0ef87dfc5ec7eb4f21bf8007ee6dc5088351acbb8c45efd2f3e11c4e7dc45db"} err="failed to get container status \"c0ef87dfc5ec7eb4f21bf8007ee6dc5088351acbb8c45efd2f3e11c4e7dc45db\": rpc error: code = NotFound desc = could not find container \"c0ef87dfc5ec7eb4f21bf8007ee6dc5088351acbb8c45efd2f3e11c4e7dc45db\": container with ID starting with c0ef87dfc5ec7eb4f21bf8007ee6dc5088351acbb8c45efd2f3e11c4e7dc45db not found: ID does not exist" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.097870 4755 scope.go:117] "RemoveContainer" containerID="6dc38518bc8d6e97bb91711e53f8735a78ec552778c6e83d690d1f0333b0ac98" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.111792 4755 scope.go:117] "RemoveContainer" containerID="b9a7605b1b695c567cf8c50b8cb35459009db3ba302dffb62defe44ce90fbf2c" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.141270 4755 scope.go:117] "RemoveContainer" containerID="37a00d6dff43519e39116638fed3ea8dd50d86dd873ab314b354a54b666cfcb7" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.158235 4755 scope.go:117] "RemoveContainer" containerID="6dc38518bc8d6e97bb91711e53f8735a78ec552778c6e83d690d1f0333b0ac98" Oct 06 08:25:55 crc kubenswrapper[4755]: E1006 08:25:55.158890 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dc38518bc8d6e97bb91711e53f8735a78ec552778c6e83d690d1f0333b0ac98\": container with ID starting with 
6dc38518bc8d6e97bb91711e53f8735a78ec552778c6e83d690d1f0333b0ac98 not found: ID does not exist" containerID="6dc38518bc8d6e97bb91711e53f8735a78ec552778c6e83d690d1f0333b0ac98" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.158948 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc38518bc8d6e97bb91711e53f8735a78ec552778c6e83d690d1f0333b0ac98"} err="failed to get container status \"6dc38518bc8d6e97bb91711e53f8735a78ec552778c6e83d690d1f0333b0ac98\": rpc error: code = NotFound desc = could not find container \"6dc38518bc8d6e97bb91711e53f8735a78ec552778c6e83d690d1f0333b0ac98\": container with ID starting with 6dc38518bc8d6e97bb91711e53f8735a78ec552778c6e83d690d1f0333b0ac98 not found: ID does not exist" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.158996 4755 scope.go:117] "RemoveContainer" containerID="b9a7605b1b695c567cf8c50b8cb35459009db3ba302dffb62defe44ce90fbf2c" Oct 06 08:25:55 crc kubenswrapper[4755]: E1006 08:25:55.159588 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9a7605b1b695c567cf8c50b8cb35459009db3ba302dffb62defe44ce90fbf2c\": container with ID starting with b9a7605b1b695c567cf8c50b8cb35459009db3ba302dffb62defe44ce90fbf2c not found: ID does not exist" containerID="b9a7605b1b695c567cf8c50b8cb35459009db3ba302dffb62defe44ce90fbf2c" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.159616 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9a7605b1b695c567cf8c50b8cb35459009db3ba302dffb62defe44ce90fbf2c"} err="failed to get container status \"b9a7605b1b695c567cf8c50b8cb35459009db3ba302dffb62defe44ce90fbf2c\": rpc error: code = NotFound desc = could not find container \"b9a7605b1b695c567cf8c50b8cb35459009db3ba302dffb62defe44ce90fbf2c\": container with ID starting with b9a7605b1b695c567cf8c50b8cb35459009db3ba302dffb62defe44ce90fbf2c not found: ID does not 
exist" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.159634 4755 scope.go:117] "RemoveContainer" containerID="37a00d6dff43519e39116638fed3ea8dd50d86dd873ab314b354a54b666cfcb7" Oct 06 08:25:55 crc kubenswrapper[4755]: E1006 08:25:55.160352 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37a00d6dff43519e39116638fed3ea8dd50d86dd873ab314b354a54b666cfcb7\": container with ID starting with 37a00d6dff43519e39116638fed3ea8dd50d86dd873ab314b354a54b666cfcb7 not found: ID does not exist" containerID="37a00d6dff43519e39116638fed3ea8dd50d86dd873ab314b354a54b666cfcb7" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.160384 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a00d6dff43519e39116638fed3ea8dd50d86dd873ab314b354a54b666cfcb7"} err="failed to get container status \"37a00d6dff43519e39116638fed3ea8dd50d86dd873ab314b354a54b666cfcb7\": rpc error: code = NotFound desc = could not find container \"37a00d6dff43519e39116638fed3ea8dd50d86dd873ab314b354a54b666cfcb7\": container with ID starting with 37a00d6dff43519e39116638fed3ea8dd50d86dd873ab314b354a54b666cfcb7 not found: ID does not exist" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.888098 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" path="/var/lib/kubelet/pods/560eb86d-1a29-4eaf-b992-8fa7df3d492c/volumes" Oct 06 08:25:55 crc kubenswrapper[4755]: I1006 08:25:55.889337 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccfb4e16-5c5f-4724-b694-02443086a6a1" path="/var/lib/kubelet/pods/ccfb4e16-5c5f-4724-b694-02443086a6a1/volumes" Oct 06 08:25:56 crc kubenswrapper[4755]: I1006 08:25:56.354119 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6ww9"] Oct 06 08:25:56 crc kubenswrapper[4755]: I1006 08:25:56.354415 4755 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v6ww9" podUID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" containerName="registry-server" containerID="cri-o://aa05ac41af82346e6bc778d2f705b0ffae503a05a3ae2c46361a6ea96832e3b7" gracePeriod=2 Oct 06 08:25:56 crc kubenswrapper[4755]: I1006 08:25:56.711126 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:25:56 crc kubenswrapper[4755]: I1006 08:25:56.909460 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-utilities\") pod \"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\" (UID: \"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\") " Oct 06 08:25:56 crc kubenswrapper[4755]: I1006 08:25:56.909513 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-catalog-content\") pod \"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\" (UID: \"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\") " Oct 06 08:25:56 crc kubenswrapper[4755]: I1006 08:25:56.909675 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tffb\" (UniqueName: \"kubernetes.io/projected/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-kube-api-access-5tffb\") pod \"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\" (UID: \"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a\") " Oct 06 08:25:56 crc kubenswrapper[4755]: I1006 08:25:56.912055 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-utilities" (OuterVolumeSpecName: "utilities") pod "3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" (UID: "3a3d30f0-54da-4d3e-add8-faa0c3eeea1a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:25:56 crc kubenswrapper[4755]: I1006 08:25:56.920622 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-kube-api-access-5tffb" (OuterVolumeSpecName: "kube-api-access-5tffb") pod "3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" (UID: "3a3d30f0-54da-4d3e-add8-faa0c3eeea1a"). InnerVolumeSpecName "kube-api-access-5tffb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:25:56 crc kubenswrapper[4755]: I1006 08:25:56.923672 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" (UID: "3a3d30f0-54da-4d3e-add8-faa0c3eeea1a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.011407 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.011467 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.011480 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tffb\" (UniqueName: \"kubernetes.io/projected/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a-kube-api-access-5tffb\") on node \"crc\" DevicePath \"\"" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.045676 4755 generic.go:334] "Generic (PLEG): container finished" podID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" 
containerID="aa05ac41af82346e6bc778d2f705b0ffae503a05a3ae2c46361a6ea96832e3b7" exitCode=0 Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.045782 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6ww9" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.045774 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6ww9" event={"ID":"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a","Type":"ContainerDied","Data":"aa05ac41af82346e6bc778d2f705b0ffae503a05a3ae2c46361a6ea96832e3b7"} Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.046208 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6ww9" event={"ID":"3a3d30f0-54da-4d3e-add8-faa0c3eeea1a","Type":"ContainerDied","Data":"cf8689c0c1107da37dbae0864642533ec5ae030aacaabb2c949576829cb4670e"} Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.046232 4755 scope.go:117] "RemoveContainer" containerID="aa05ac41af82346e6bc778d2f705b0ffae503a05a3ae2c46361a6ea96832e3b7" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.064054 4755 scope.go:117] "RemoveContainer" containerID="97e6e2a919da2ef8d99b4a30c7cdf961948c9be0ae0bc0a41da036f6e4e5429f" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.076609 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6ww9"] Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.079116 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6ww9"] Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.086320 4755 scope.go:117] "RemoveContainer" containerID="1efd1d30f4af2618d1395fb1717c0547de3dfaa41cb1446756e69836c1ceba24" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.101348 4755 scope.go:117] "RemoveContainer" containerID="aa05ac41af82346e6bc778d2f705b0ffae503a05a3ae2c46361a6ea96832e3b7" Oct 06 
08:25:57 crc kubenswrapper[4755]: E1006 08:25:57.102141 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa05ac41af82346e6bc778d2f705b0ffae503a05a3ae2c46361a6ea96832e3b7\": container with ID starting with aa05ac41af82346e6bc778d2f705b0ffae503a05a3ae2c46361a6ea96832e3b7 not found: ID does not exist" containerID="aa05ac41af82346e6bc778d2f705b0ffae503a05a3ae2c46361a6ea96832e3b7" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.102244 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa05ac41af82346e6bc778d2f705b0ffae503a05a3ae2c46361a6ea96832e3b7"} err="failed to get container status \"aa05ac41af82346e6bc778d2f705b0ffae503a05a3ae2c46361a6ea96832e3b7\": rpc error: code = NotFound desc = could not find container \"aa05ac41af82346e6bc778d2f705b0ffae503a05a3ae2c46361a6ea96832e3b7\": container with ID starting with aa05ac41af82346e6bc778d2f705b0ffae503a05a3ae2c46361a6ea96832e3b7 not found: ID does not exist" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.102334 4755 scope.go:117] "RemoveContainer" containerID="97e6e2a919da2ef8d99b4a30c7cdf961948c9be0ae0bc0a41da036f6e4e5429f" Oct 06 08:25:57 crc kubenswrapper[4755]: E1006 08:25:57.102908 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97e6e2a919da2ef8d99b4a30c7cdf961948c9be0ae0bc0a41da036f6e4e5429f\": container with ID starting with 97e6e2a919da2ef8d99b4a30c7cdf961948c9be0ae0bc0a41da036f6e4e5429f not found: ID does not exist" containerID="97e6e2a919da2ef8d99b4a30c7cdf961948c9be0ae0bc0a41da036f6e4e5429f" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.102944 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97e6e2a919da2ef8d99b4a30c7cdf961948c9be0ae0bc0a41da036f6e4e5429f"} err="failed to get container status 
\"97e6e2a919da2ef8d99b4a30c7cdf961948c9be0ae0bc0a41da036f6e4e5429f\": rpc error: code = NotFound desc = could not find container \"97e6e2a919da2ef8d99b4a30c7cdf961948c9be0ae0bc0a41da036f6e4e5429f\": container with ID starting with 97e6e2a919da2ef8d99b4a30c7cdf961948c9be0ae0bc0a41da036f6e4e5429f not found: ID does not exist" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.102982 4755 scope.go:117] "RemoveContainer" containerID="1efd1d30f4af2618d1395fb1717c0547de3dfaa41cb1446756e69836c1ceba24" Oct 06 08:25:57 crc kubenswrapper[4755]: E1006 08:25:57.103445 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1efd1d30f4af2618d1395fb1717c0547de3dfaa41cb1446756e69836c1ceba24\": container with ID starting with 1efd1d30f4af2618d1395fb1717c0547de3dfaa41cb1446756e69836c1ceba24 not found: ID does not exist" containerID="1efd1d30f4af2618d1395fb1717c0547de3dfaa41cb1446756e69836c1ceba24" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.103478 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1efd1d30f4af2618d1395fb1717c0547de3dfaa41cb1446756e69836c1ceba24"} err="failed to get container status \"1efd1d30f4af2618d1395fb1717c0547de3dfaa41cb1446756e69836c1ceba24\": rpc error: code = NotFound desc = could not find container \"1efd1d30f4af2618d1395fb1717c0547de3dfaa41cb1446756e69836c1ceba24\": container with ID starting with 1efd1d30f4af2618d1395fb1717c0547de3dfaa41cb1446756e69836c1ceba24 not found: ID does not exist" Oct 06 08:25:57 crc kubenswrapper[4755]: I1006 08:25:57.893444 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" path="/var/lib/kubelet/pods/3a3d30f0-54da-4d3e-add8-faa0c3eeea1a/volumes" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.352126 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" podUID="92199f0a-b1db-438f-8e44-446e840f07cf" containerName="oauth-openshift" containerID="cri-o://8173d79c148ac8d2967d968284a2c3bccb725aa64c8b54f5aabf16efdddbd892" gracePeriod=15 Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.744287 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.786781 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-f4f7798bf-8w6k4"] Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787019 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" containerName="registry-server" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787034 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" containerName="registry-server" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787046 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" containerName="extract-utilities" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787053 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" containerName="extract-utilities" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787062 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" containerName="registry-server" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787069 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" containerName="registry-server" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787077 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" 
containerName="extract-content" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787083 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" containerName="extract-content" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787092 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" containerName="registry-server" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787098 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" containerName="registry-server" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787107 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccfb4e16-5c5f-4724-b694-02443086a6a1" containerName="registry-server" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787117 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccfb4e16-5c5f-4724-b694-02443086a6a1" containerName="registry-server" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787125 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccfb4e16-5c5f-4724-b694-02443086a6a1" containerName="extract-utilities" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787132 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccfb4e16-5c5f-4724-b694-02443086a6a1" containerName="extract-utilities" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787141 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" containerName="extract-content" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787150 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" containerName="extract-content" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787159 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" 
containerName="extract-utilities" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787165 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" containerName="extract-utilities" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787173 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" containerName="extract-utilities" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787179 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" containerName="extract-utilities" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787188 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92199f0a-b1db-438f-8e44-446e840f07cf" containerName="oauth-openshift" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787195 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="92199f0a-b1db-438f-8e44-446e840f07cf" containerName="oauth-openshift" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787205 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5479300-81a9-4688-93a4-6a7498b6223f" containerName="pruner" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787211 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5479300-81a9-4688-93a4-6a7498b6223f" containerName="pruner" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787218 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" containerName="extract-content" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787224 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" containerName="extract-content" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787235 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9400682a-9230-42e8-95eb-2651c4ebdb5d" containerName="pruner" 
Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787240 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9400682a-9230-42e8-95eb-2651c4ebdb5d" containerName="pruner" Oct 06 08:26:09 crc kubenswrapper[4755]: E1006 08:26:09.787248 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccfb4e16-5c5f-4724-b694-02443086a6a1" containerName="extract-content" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787254 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccfb4e16-5c5f-4724-b694-02443086a6a1" containerName="extract-content" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787338 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="560eb86d-1a29-4eaf-b992-8fa7df3d492c" containerName="registry-server" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787348 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccfb4e16-5c5f-4724-b694-02443086a6a1" containerName="registry-server" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787357 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3d30f0-54da-4d3e-add8-faa0c3eeea1a" containerName="registry-server" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787368 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9400682a-9230-42e8-95eb-2651c4ebdb5d" containerName="pruner" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787377 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="92199f0a-b1db-438f-8e44-446e840f07cf" containerName="oauth-openshift" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787384 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b269dcd-1ae6-4d95-b56d-b72b7ad9eaa1" containerName="registry-server" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.787393 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5479300-81a9-4688-93a4-6a7498b6223f" containerName="pruner" Oct 06 08:26:09 
crc kubenswrapper[4755]: I1006 08:26:09.787849 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.799710 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f4f7798bf-8w6k4"] Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.895595 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-idp-0-file-data\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.896088 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-router-certs\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.896356 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-service-ca\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.896621 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-login\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.896771 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-provider-selection\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.896899 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-cliconfig\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.896971 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qhd7\" (UniqueName: \"kubernetes.io/projected/92199f0a-b1db-438f-8e44-446e840f07cf-kube-api-access-9qhd7\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.897068 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-audit-policies\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.897149 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-serving-cert\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.897234 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-error\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.897294 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-ocp-branding-template\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.897361 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-trusted-ca-bundle\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.897426 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92199f0a-b1db-438f-8e44-446e840f07cf-audit-dir\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.897480 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-session\") pod \"92199f0a-b1db-438f-8e44-446e840f07cf\" (UID: \"92199f0a-b1db-438f-8e44-446e840f07cf\") " Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.897672 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-cliconfig" 
(OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.897751 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.897827 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-audit-policies\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.897906 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-service-ca\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898003 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: 
\"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.897840 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92199f0a-b1db-438f-8e44-446e840f07cf-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898113 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-router-certs\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898166 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898216 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-session\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898307 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-audit-dir\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898347 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-user-template-error\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898385 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898429 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-user-template-login\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898491 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898553 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtk5d\" (UniqueName: \"kubernetes.io/projected/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-kube-api-access-mtk5d\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898659 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898704 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898817 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:09 crc kubenswrapper[4755]: 
I1006 08:26:09.898914 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.898993 4755 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92199f0a-b1db-438f-8e44-446e840f07cf-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.899217 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.899769 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.905636 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.905875 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.906144 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.906198 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92199f0a-b1db-438f-8e44-446e840f07cf-kube-api-access-9qhd7" (OuterVolumeSpecName: "kube-api-access-9qhd7") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "kube-api-access-9qhd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.906964 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.907430 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.907939 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.909076 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:26:09 crc kubenswrapper[4755]: I1006 08:26:09.912061 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "92199f0a-b1db-438f-8e44-446e840f07cf" (UID: "92199f0a-b1db-438f-8e44-446e840f07cf"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.000420 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-audit-policies\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.000702 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-service-ca\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.000764 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.000806 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-router-certs\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.000841 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.000898 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-session\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.000989 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-audit-dir\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.001026 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-user-template-error\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.001062 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc 
kubenswrapper[4755]: I1006 08:26:10.001127 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-user-template-login\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.001191 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.001235 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtk5d\" (UniqueName: \"kubernetes.io/projected/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-kube-api-access-mtk5d\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.001280 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.001317 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.002324 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-audit-policies\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.003268 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.003797 4755 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.004058 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-audit-dir\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.005046 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-service-ca\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.008089 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.008156 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.008698 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.008722 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.008745 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.008767 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.008832 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.008891 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.008916 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92199f0a-b1db-438f-8e44-446e840f07cf-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.008937 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qhd7\" (UniqueName: \"kubernetes.io/projected/92199f0a-b1db-438f-8e44-446e840f07cf-kube-api-access-9qhd7\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.009918 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-user-template-login\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.014894 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.018351 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.018628 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.018967 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.021967 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-router-certs\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " 
pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.022486 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-user-template-error\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.023957 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-session\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.024813 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.026861 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtk5d\" (UniqueName: \"kubernetes.io/projected/f1e6646a-6edf-4a0c-9ad1-0caeadefbffa-kube-api-access-mtk5d\") pod \"oauth-openshift-f4f7798bf-8w6k4\" (UID: \"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa\") " pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.113503 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.133551 4755 generic.go:334] "Generic (PLEG): container finished" podID="92199f0a-b1db-438f-8e44-446e840f07cf" containerID="8173d79c148ac8d2967d968284a2c3bccb725aa64c8b54f5aabf16efdddbd892" exitCode=0 Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.133660 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" event={"ID":"92199f0a-b1db-438f-8e44-446e840f07cf","Type":"ContainerDied","Data":"8173d79c148ac8d2967d968284a2c3bccb725aa64c8b54f5aabf16efdddbd892"} Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.133699 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" event={"ID":"92199f0a-b1db-438f-8e44-446e840f07cf","Type":"ContainerDied","Data":"bfd8897102d298c46116f492287ce692f6d78f29b912464992c72c149d1dce46"} Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.133724 4755 scope.go:117] "RemoveContainer" containerID="8173d79c148ac8d2967d968284a2c3bccb725aa64c8b54f5aabf16efdddbd892" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.133893 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p47k9" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.181388 4755 scope.go:117] "RemoveContainer" containerID="8173d79c148ac8d2967d968284a2c3bccb725aa64c8b54f5aabf16efdddbd892" Oct 06 08:26:10 crc kubenswrapper[4755]: E1006 08:26:10.182031 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8173d79c148ac8d2967d968284a2c3bccb725aa64c8b54f5aabf16efdddbd892\": container with ID starting with 8173d79c148ac8d2967d968284a2c3bccb725aa64c8b54f5aabf16efdddbd892 not found: ID does not exist" containerID="8173d79c148ac8d2967d968284a2c3bccb725aa64c8b54f5aabf16efdddbd892" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.182073 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8173d79c148ac8d2967d968284a2c3bccb725aa64c8b54f5aabf16efdddbd892"} err="failed to get container status \"8173d79c148ac8d2967d968284a2c3bccb725aa64c8b54f5aabf16efdddbd892\": rpc error: code = NotFound desc = could not find container \"8173d79c148ac8d2967d968284a2c3bccb725aa64c8b54f5aabf16efdddbd892\": container with ID starting with 8173d79c148ac8d2967d968284a2c3bccb725aa64c8b54f5aabf16efdddbd892 not found: ID does not exist" Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.187364 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p47k9"] Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.193783 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p47k9"] Oct 06 08:26:10 crc kubenswrapper[4755]: I1006 08:26:10.369112 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f4f7798bf-8w6k4"] Oct 06 08:26:11 crc kubenswrapper[4755]: I1006 08:26:11.142945 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" event={"ID":"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa","Type":"ContainerStarted","Data":"c6b0840c51f272c7161ca2e522e7c54373514f0ea76240ef67be3083575ca600"} Oct 06 08:26:11 crc kubenswrapper[4755]: I1006 08:26:11.143422 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" event={"ID":"f1e6646a-6edf-4a0c-9ad1-0caeadefbffa","Type":"ContainerStarted","Data":"1579e10baa0961e837704ef3be1d55835758231f33aa442caf5d2014842a7383"} Oct 06 08:26:11 crc kubenswrapper[4755]: I1006 08:26:11.143462 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:11 crc kubenswrapper[4755]: I1006 08:26:11.150113 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" Oct 06 08:26:11 crc kubenswrapper[4755]: I1006 08:26:11.166297 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-f4f7798bf-8w6k4" podStartSLOduration=27.166270102 podStartE2EDuration="27.166270102s" podCreationTimestamp="2025-10-06 08:25:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:26:11.162634575 +0000 UTC m=+227.991949799" watchObservedRunningTime="2025-10-06 08:26:11.166270102 +0000 UTC m=+227.995585316" Oct 06 08:26:11 crc kubenswrapper[4755]: I1006 08:26:11.891172 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92199f0a-b1db-438f-8e44-446e840f07cf" path="/var/lib/kubelet/pods/92199f0a-b1db-438f-8e44-446e840f07cf/volumes" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.271521 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9jnxh"] Oct 06 08:26:44 crc kubenswrapper[4755]: 
I1006 08:26:44.274925 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9jnxh" podUID="79294028-a667-4a44-bf46-a7597f221243" containerName="registry-server" containerID="cri-o://bda4b341445a062be65632c1476498e6d05c14c521829aef0e26c2a6a0fd9ae5" gracePeriod=30 Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.294155 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dzsrh"] Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.294892 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dzsrh" podUID="b94e8d7e-d807-4809-ac0e-a219363e15d0" containerName="registry-server" containerID="cri-o://6e63a77e7e70b06e64338ea7e08c16490c03ee4ac8de49af6c75d85a37d02614" gracePeriod=30 Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.298231 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zqsmk"] Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.298605 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" podUID="960d9d23-73b6-49b2-8772-eca49d507f2f" containerName="marketplace-operator" containerID="cri-o://c5951e7cc98206292843533e28d0abb9e3c14e38fa028c011039d7dbef293a29" gracePeriod=30 Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.308061 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7dnk"] Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.308394 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w7dnk" podUID="62ecb1af-d269-4f5a-84ec-026b74882414" containerName="registry-server" containerID="cri-o://f4cba2cf17bdd5eae1d684ae466a5ce7269c0d949aa0689e3a185af8a3e6b3f4" gracePeriod=30 Oct 06 
08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.324768 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4zm7s"] Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.324998 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4zm7s" podUID="6fc654c8-49a2-4815-b0c6-edfc8ac3d836" containerName="registry-server" containerID="cri-o://6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84" gracePeriod=30 Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.335379 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p4ld2"] Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.336102 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.349936 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p4ld2"] Oct 06 08:26:44 crc kubenswrapper[4755]: E1006 08:26:44.385111 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84 is running failed: container process not found" containerID="6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84" cmd=["grpc_health_probe","-addr=:50051"] Oct 06 08:26:44 crc kubenswrapper[4755]: E1006 08:26:44.385752 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84 is running failed: container process not found" containerID="6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84" 
cmd=["grpc_health_probe","-addr=:50051"] Oct 06 08:26:44 crc kubenswrapper[4755]: E1006 08:26:44.386150 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84 is running failed: container process not found" containerID="6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84" cmd=["grpc_health_probe","-addr=:50051"] Oct 06 08:26:44 crc kubenswrapper[4755]: E1006 08:26:44.386200 4755 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-4zm7s" podUID="6fc654c8-49a2-4815-b0c6-edfc8ac3d836" containerName="registry-server" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.515491 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41630c1b-822f-4194-a858-b5f9868ad9e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p4ld2\" (UID: \"41630c1b-822f-4194-a858-b5f9868ad9e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.515655 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/41630c1b-822f-4194-a858-b5f9868ad9e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p4ld2\" (UID: \"41630c1b-822f-4194-a858-b5f9868ad9e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.515746 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jcqm\" (UniqueName: \"kubernetes.io/projected/41630c1b-822f-4194-a858-b5f9868ad9e6-kube-api-access-6jcqm\") pod \"marketplace-operator-79b997595-p4ld2\" (UID: \"41630c1b-822f-4194-a858-b5f9868ad9e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.617168 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/41630c1b-822f-4194-a858-b5f9868ad9e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p4ld2\" (UID: \"41630c1b-822f-4194-a858-b5f9868ad9e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.617708 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jcqm\" (UniqueName: \"kubernetes.io/projected/41630c1b-822f-4194-a858-b5f9868ad9e6-kube-api-access-6jcqm\") pod \"marketplace-operator-79b997595-p4ld2\" (UID: \"41630c1b-822f-4194-a858-b5f9868ad9e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.619795 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41630c1b-822f-4194-a858-b5f9868ad9e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p4ld2\" (UID: \"41630c1b-822f-4194-a858-b5f9868ad9e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.617742 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41630c1b-822f-4194-a858-b5f9868ad9e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p4ld2\" (UID: 
\"41630c1b-822f-4194-a858-b5f9868ad9e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.635472 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/41630c1b-822f-4194-a858-b5f9868ad9e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p4ld2\" (UID: \"41630c1b-822f-4194-a858-b5f9868ad9e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.643355 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jcqm\" (UniqueName: \"kubernetes.io/projected/41630c1b-822f-4194-a858-b5f9868ad9e6-kube-api-access-6jcqm\") pod \"marketplace-operator-79b997595-p4ld2\" (UID: \"41630c1b-822f-4194-a858-b5f9868ad9e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.761295 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.766111 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dzsrh" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.769265 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.776268 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.803158 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.855444 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9jnxh" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.947865 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62ecb1af-d269-4f5a-84ec-026b74882414-catalog-content\") pod \"62ecb1af-d269-4f5a-84ec-026b74882414\" (UID: \"62ecb1af-d269-4f5a-84ec-026b74882414\") " Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.948432 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-catalog-content\") pod \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\" (UID: \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\") " Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.948531 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62ecb1af-d269-4f5a-84ec-026b74882414-utilities\") pod \"62ecb1af-d269-4f5a-84ec-026b74882414\" (UID: \"62ecb1af-d269-4f5a-84ec-026b74882414\") " Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.948589 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94e8d7e-d807-4809-ac0e-a219363e15d0-catalog-content\") pod \"b94e8d7e-d807-4809-ac0e-a219363e15d0\" (UID: \"b94e8d7e-d807-4809-ac0e-a219363e15d0\") " Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.948623 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq55b\" (UniqueName: \"kubernetes.io/projected/960d9d23-73b6-49b2-8772-eca49d507f2f-kube-api-access-vq55b\") pod 
\"960d9d23-73b6-49b2-8772-eca49d507f2f\" (UID: \"960d9d23-73b6-49b2-8772-eca49d507f2f\") " Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.948694 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/960d9d23-73b6-49b2-8772-eca49d507f2f-marketplace-operator-metrics\") pod \"960d9d23-73b6-49b2-8772-eca49d507f2f\" (UID: \"960d9d23-73b6-49b2-8772-eca49d507f2f\") " Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.948712 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94e8d7e-d807-4809-ac0e-a219363e15d0-utilities\") pod \"b94e8d7e-d807-4809-ac0e-a219363e15d0\" (UID: \"b94e8d7e-d807-4809-ac0e-a219363e15d0\") " Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.948733 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k6nm\" (UniqueName: \"kubernetes.io/projected/b94e8d7e-d807-4809-ac0e-a219363e15d0-kube-api-access-2k6nm\") pod \"b94e8d7e-d807-4809-ac0e-a219363e15d0\" (UID: \"b94e8d7e-d807-4809-ac0e-a219363e15d0\") " Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.948758 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-utilities\") pod \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\" (UID: \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\") " Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.948818 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mktmr\" (UniqueName: \"kubernetes.io/projected/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-kube-api-access-mktmr\") pod \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\" (UID: \"6fc654c8-49a2-4815-b0c6-edfc8ac3d836\") " Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.948842 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/960d9d23-73b6-49b2-8772-eca49d507f2f-marketplace-trusted-ca\") pod \"960d9d23-73b6-49b2-8772-eca49d507f2f\" (UID: \"960d9d23-73b6-49b2-8772-eca49d507f2f\") " Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.948870 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlrhm\" (UniqueName: \"kubernetes.io/projected/62ecb1af-d269-4f5a-84ec-026b74882414-kube-api-access-jlrhm\") pod \"62ecb1af-d269-4f5a-84ec-026b74882414\" (UID: \"62ecb1af-d269-4f5a-84ec-026b74882414\") " Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.950374 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62ecb1af-d269-4f5a-84ec-026b74882414-utilities" (OuterVolumeSpecName: "utilities") pod "62ecb1af-d269-4f5a-84ec-026b74882414" (UID: "62ecb1af-d269-4f5a-84ec-026b74882414"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.951772 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b94e8d7e-d807-4809-ac0e-a219363e15d0-utilities" (OuterVolumeSpecName: "utilities") pod "b94e8d7e-d807-4809-ac0e-a219363e15d0" (UID: "b94e8d7e-d807-4809-ac0e-a219363e15d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.953033 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/960d9d23-73b6-49b2-8772-eca49d507f2f-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "960d9d23-73b6-49b2-8772-eca49d507f2f" (UID: "960d9d23-73b6-49b2-8772-eca49d507f2f"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.953628 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-utilities" (OuterVolumeSpecName: "utilities") pod "6fc654c8-49a2-4815-b0c6-edfc8ac3d836" (UID: "6fc654c8-49a2-4815-b0c6-edfc8ac3d836"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.956511 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b94e8d7e-d807-4809-ac0e-a219363e15d0-kube-api-access-2k6nm" (OuterVolumeSpecName: "kube-api-access-2k6nm") pod "b94e8d7e-d807-4809-ac0e-a219363e15d0" (UID: "b94e8d7e-d807-4809-ac0e-a219363e15d0"). InnerVolumeSpecName "kube-api-access-2k6nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.956547 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-kube-api-access-mktmr" (OuterVolumeSpecName: "kube-api-access-mktmr") pod "6fc654c8-49a2-4815-b0c6-edfc8ac3d836" (UID: "6fc654c8-49a2-4815-b0c6-edfc8ac3d836"). InnerVolumeSpecName "kube-api-access-mktmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.956688 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/960d9d23-73b6-49b2-8772-eca49d507f2f-kube-api-access-vq55b" (OuterVolumeSpecName: "kube-api-access-vq55b") pod "960d9d23-73b6-49b2-8772-eca49d507f2f" (UID: "960d9d23-73b6-49b2-8772-eca49d507f2f"). InnerVolumeSpecName "kube-api-access-vq55b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:26:44 crc kubenswrapper[4755]: I1006 08:26:44.956716 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/960d9d23-73b6-49b2-8772-eca49d507f2f-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "960d9d23-73b6-49b2-8772-eca49d507f2f" (UID: "960d9d23-73b6-49b2-8772-eca49d507f2f"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:44.960970 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ecb1af-d269-4f5a-84ec-026b74882414-kube-api-access-jlrhm" (OuterVolumeSpecName: "kube-api-access-jlrhm") pod "62ecb1af-d269-4f5a-84ec-026b74882414" (UID: "62ecb1af-d269-4f5a-84ec-026b74882414"). InnerVolumeSpecName "kube-api-access-jlrhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:44.968271 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62ecb1af-d269-4f5a-84ec-026b74882414-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62ecb1af-d269-4f5a-84ec-026b74882414" (UID: "62ecb1af-d269-4f5a-84ec-026b74882414"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.015799 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b94e8d7e-d807-4809-ac0e-a219363e15d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b94e8d7e-d807-4809-ac0e-a219363e15d0" (UID: "b94e8d7e-d807-4809-ac0e-a219363e15d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.050419 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fc654c8-49a2-4815-b0c6-edfc8ac3d836" (UID: "6fc654c8-49a2-4815-b0c6-edfc8ac3d836"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.052555 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r4sb\" (UniqueName: \"kubernetes.io/projected/79294028-a667-4a44-bf46-a7597f221243-kube-api-access-8r4sb\") pod \"79294028-a667-4a44-bf46-a7597f221243\" (UID: \"79294028-a667-4a44-bf46-a7597f221243\") " Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.052976 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79294028-a667-4a44-bf46-a7597f221243-catalog-content\") pod \"79294028-a667-4a44-bf46-a7597f221243\" (UID: \"79294028-a667-4a44-bf46-a7597f221243\") " Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.053010 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79294028-a667-4a44-bf46-a7597f221243-utilities\") pod \"79294028-a667-4a44-bf46-a7597f221243\" (UID: \"79294028-a667-4a44-bf46-a7597f221243\") " Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.053202 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.053214 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/62ecb1af-d269-4f5a-84ec-026b74882414-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.053225 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94e8d7e-d807-4809-ac0e-a219363e15d0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.053236 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq55b\" (UniqueName: \"kubernetes.io/projected/960d9d23-73b6-49b2-8772-eca49d507f2f-kube-api-access-vq55b\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.053247 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/960d9d23-73b6-49b2-8772-eca49d507f2f-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.053257 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94e8d7e-d807-4809-ac0e-a219363e15d0-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.053265 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k6nm\" (UniqueName: \"kubernetes.io/projected/b94e8d7e-d807-4809-ac0e-a219363e15d0-kube-api-access-2k6nm\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.053273 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.053281 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mktmr\" (UniqueName: 
\"kubernetes.io/projected/6fc654c8-49a2-4815-b0c6-edfc8ac3d836-kube-api-access-mktmr\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.053290 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/960d9d23-73b6-49b2-8772-eca49d507f2f-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.053299 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlrhm\" (UniqueName: \"kubernetes.io/projected/62ecb1af-d269-4f5a-84ec-026b74882414-kube-api-access-jlrhm\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.053307 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62ecb1af-d269-4f5a-84ec-026b74882414-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.057395 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79294028-a667-4a44-bf46-a7597f221243-kube-api-access-8r4sb" (OuterVolumeSpecName: "kube-api-access-8r4sb") pod "79294028-a667-4a44-bf46-a7597f221243" (UID: "79294028-a667-4a44-bf46-a7597f221243"). InnerVolumeSpecName "kube-api-access-8r4sb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.057614 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79294028-a667-4a44-bf46-a7597f221243-utilities" (OuterVolumeSpecName: "utilities") pod "79294028-a667-4a44-bf46-a7597f221243" (UID: "79294028-a667-4a44-bf46-a7597f221243"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.080799 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p4ld2"] Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.104555 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79294028-a667-4a44-bf46-a7597f221243-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79294028-a667-4a44-bf46-a7597f221243" (UID: "79294028-a667-4a44-bf46-a7597f221243"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.153919 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79294028-a667-4a44-bf46-a7597f221243-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.154086 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r4sb\" (UniqueName: \"kubernetes.io/projected/79294028-a667-4a44-bf46-a7597f221243-kube-api-access-8r4sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.154184 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79294028-a667-4a44-bf46-a7597f221243-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.359042 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" event={"ID":"41630c1b-822f-4194-a858-b5f9868ad9e6","Type":"ContainerStarted","Data":"51501a8e3f990a4f3ad50ac2fac6371bd729e0511dc062779f6af25570be2c86"} Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.359105 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.359120 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" event={"ID":"41630c1b-822f-4194-a858-b5f9868ad9e6","Type":"ContainerStarted","Data":"b5b3528ca78a75a8fe3b4f523c138e40d3db1ba83ac214110515176d25a4987e"} Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.361202 4755 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p4ld2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 10.217.0.56:8080: connect: connection refused" start-of-body= Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.361267 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" podUID="41630c1b-822f-4194-a858-b5f9868ad9e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 10.217.0.56:8080: connect: connection refused" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.362875 4755 generic.go:334] "Generic (PLEG): container finished" podID="79294028-a667-4a44-bf46-a7597f221243" containerID="bda4b341445a062be65632c1476498e6d05c14c521829aef0e26c2a6a0fd9ae5" exitCode=0 Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.362938 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9jnxh" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.362930 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jnxh" event={"ID":"79294028-a667-4a44-bf46-a7597f221243","Type":"ContainerDied","Data":"bda4b341445a062be65632c1476498e6d05c14c521829aef0e26c2a6a0fd9ae5"} Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.363111 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jnxh" event={"ID":"79294028-a667-4a44-bf46-a7597f221243","Type":"ContainerDied","Data":"efc76c84d807f7011156d15453669b8ca550b4efe83fe146720e0c5db4f479b0"} Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.363167 4755 scope.go:117] "RemoveContainer" containerID="bda4b341445a062be65632c1476498e6d05c14c521829aef0e26c2a6a0fd9ae5" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.365883 4755 generic.go:334] "Generic (PLEG): container finished" podID="960d9d23-73b6-49b2-8772-eca49d507f2f" containerID="c5951e7cc98206292843533e28d0abb9e3c14e38fa028c011039d7dbef293a29" exitCode=0 Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.366146 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" event={"ID":"960d9d23-73b6-49b2-8772-eca49d507f2f","Type":"ContainerDied","Data":"c5951e7cc98206292843533e28d0abb9e3c14e38fa028c011039d7dbef293a29"} Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.366190 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" event={"ID":"960d9d23-73b6-49b2-8772-eca49d507f2f","Type":"ContainerDied","Data":"e3d0158ad413c7ece37e51fb53945de4b694147a2824fe4616b8f94a2aac7cec"} Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.366302 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zqsmk" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.371132 4755 generic.go:334] "Generic (PLEG): container finished" podID="b94e8d7e-d807-4809-ac0e-a219363e15d0" containerID="6e63a77e7e70b06e64338ea7e08c16490c03ee4ac8de49af6c75d85a37d02614" exitCode=0 Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.371216 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dzsrh" event={"ID":"b94e8d7e-d807-4809-ac0e-a219363e15d0","Type":"ContainerDied","Data":"6e63a77e7e70b06e64338ea7e08c16490c03ee4ac8de49af6c75d85a37d02614"} Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.371251 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dzsrh" event={"ID":"b94e8d7e-d807-4809-ac0e-a219363e15d0","Type":"ContainerDied","Data":"87360e739176757f1d10b66627cbdfb11031ea0727f89dcc2a9f65593eb1daf7"} Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.371353 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dzsrh" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.383195 4755 generic.go:334] "Generic (PLEG): container finished" podID="62ecb1af-d269-4f5a-84ec-026b74882414" containerID="f4cba2cf17bdd5eae1d684ae466a5ce7269c0d949aa0689e3a185af8a3e6b3f4" exitCode=0 Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.383274 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7dnk" event={"ID":"62ecb1af-d269-4f5a-84ec-026b74882414","Type":"ContainerDied","Data":"f4cba2cf17bdd5eae1d684ae466a5ce7269c0d949aa0689e3a185af8a3e6b3f4"} Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.383313 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7dnk" event={"ID":"62ecb1af-d269-4f5a-84ec-026b74882414","Type":"ContainerDied","Data":"18746b12072333b6fd8444dc46005e4f41635744aca81c2850368408fb5d1ae4"} Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.383416 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7dnk" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.384208 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" podStartSLOduration=1.384186713 podStartE2EDuration="1.384186713s" podCreationTimestamp="2025-10-06 08:26:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:26:45.381220707 +0000 UTC m=+262.210535921" watchObservedRunningTime="2025-10-06 08:26:45.384186713 +0000 UTC m=+262.213501927" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.390798 4755 generic.go:334] "Generic (PLEG): container finished" podID="6fc654c8-49a2-4815-b0c6-edfc8ac3d836" containerID="6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84" exitCode=0 Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.390894 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zm7s" event={"ID":"6fc654c8-49a2-4815-b0c6-edfc8ac3d836","Type":"ContainerDied","Data":"6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84"} Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.390905 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4zm7s" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.390929 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zm7s" event={"ID":"6fc654c8-49a2-4815-b0c6-edfc8ac3d836","Type":"ContainerDied","Data":"ffa44547305ccdad2c7bca4e71c455bface3e8dd80791cb86dd358eb06ee6ef4"} Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.395589 4755 scope.go:117] "RemoveContainer" containerID="655ee3e55a58328473728cfaf241409bc8eaf344a50e401897ed4dce6d31817e" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.418809 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9jnxh"] Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.422797 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9jnxh"] Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.430840 4755 scope.go:117] "RemoveContainer" containerID="0a7817888ad0412212d54727689dfc1afa937e432d47b3de3e629e564ebe8f6b" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.439906 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dzsrh"] Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.442475 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dzsrh"] Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.453372 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zqsmk"] Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.456670 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zqsmk"] Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.466769 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7dnk"] Oct 06 08:26:45 
crc kubenswrapper[4755]: I1006 08:26:45.474384 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7dnk"] Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.482117 4755 scope.go:117] "RemoveContainer" containerID="bda4b341445a062be65632c1476498e6d05c14c521829aef0e26c2a6a0fd9ae5" Oct 06 08:26:45 crc kubenswrapper[4755]: E1006 08:26:45.483001 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bda4b341445a062be65632c1476498e6d05c14c521829aef0e26c2a6a0fd9ae5\": container with ID starting with bda4b341445a062be65632c1476498e6d05c14c521829aef0e26c2a6a0fd9ae5 not found: ID does not exist" containerID="bda4b341445a062be65632c1476498e6d05c14c521829aef0e26c2a6a0fd9ae5" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.483049 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bda4b341445a062be65632c1476498e6d05c14c521829aef0e26c2a6a0fd9ae5"} err="failed to get container status \"bda4b341445a062be65632c1476498e6d05c14c521829aef0e26c2a6a0fd9ae5\": rpc error: code = NotFound desc = could not find container \"bda4b341445a062be65632c1476498e6d05c14c521829aef0e26c2a6a0fd9ae5\": container with ID starting with bda4b341445a062be65632c1476498e6d05c14c521829aef0e26c2a6a0fd9ae5 not found: ID does not exist" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.483082 4755 scope.go:117] "RemoveContainer" containerID="655ee3e55a58328473728cfaf241409bc8eaf344a50e401897ed4dce6d31817e" Oct 06 08:26:45 crc kubenswrapper[4755]: E1006 08:26:45.483676 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"655ee3e55a58328473728cfaf241409bc8eaf344a50e401897ed4dce6d31817e\": container with ID starting with 655ee3e55a58328473728cfaf241409bc8eaf344a50e401897ed4dce6d31817e not found: ID does not exist" 
containerID="655ee3e55a58328473728cfaf241409bc8eaf344a50e401897ed4dce6d31817e" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.483717 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"655ee3e55a58328473728cfaf241409bc8eaf344a50e401897ed4dce6d31817e"} err="failed to get container status \"655ee3e55a58328473728cfaf241409bc8eaf344a50e401897ed4dce6d31817e\": rpc error: code = NotFound desc = could not find container \"655ee3e55a58328473728cfaf241409bc8eaf344a50e401897ed4dce6d31817e\": container with ID starting with 655ee3e55a58328473728cfaf241409bc8eaf344a50e401897ed4dce6d31817e not found: ID does not exist" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.483747 4755 scope.go:117] "RemoveContainer" containerID="0a7817888ad0412212d54727689dfc1afa937e432d47b3de3e629e564ebe8f6b" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.484358 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4zm7s"] Oct 06 08:26:45 crc kubenswrapper[4755]: E1006 08:26:45.484408 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a7817888ad0412212d54727689dfc1afa937e432d47b3de3e629e564ebe8f6b\": container with ID starting with 0a7817888ad0412212d54727689dfc1afa937e432d47b3de3e629e564ebe8f6b not found: ID does not exist" containerID="0a7817888ad0412212d54727689dfc1afa937e432d47b3de3e629e564ebe8f6b" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.484487 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a7817888ad0412212d54727689dfc1afa937e432d47b3de3e629e564ebe8f6b"} err="failed to get container status \"0a7817888ad0412212d54727689dfc1afa937e432d47b3de3e629e564ebe8f6b\": rpc error: code = NotFound desc = could not find container \"0a7817888ad0412212d54727689dfc1afa937e432d47b3de3e629e564ebe8f6b\": container with ID starting with 
0a7817888ad0412212d54727689dfc1afa937e432d47b3de3e629e564ebe8f6b not found: ID does not exist" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.484535 4755 scope.go:117] "RemoveContainer" containerID="c5951e7cc98206292843533e28d0abb9e3c14e38fa028c011039d7dbef293a29" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.486988 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4zm7s"] Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.500146 4755 scope.go:117] "RemoveContainer" containerID="c5951e7cc98206292843533e28d0abb9e3c14e38fa028c011039d7dbef293a29" Oct 06 08:26:45 crc kubenswrapper[4755]: E1006 08:26:45.501071 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5951e7cc98206292843533e28d0abb9e3c14e38fa028c011039d7dbef293a29\": container with ID starting with c5951e7cc98206292843533e28d0abb9e3c14e38fa028c011039d7dbef293a29 not found: ID does not exist" containerID="c5951e7cc98206292843533e28d0abb9e3c14e38fa028c011039d7dbef293a29" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.501134 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5951e7cc98206292843533e28d0abb9e3c14e38fa028c011039d7dbef293a29"} err="failed to get container status \"c5951e7cc98206292843533e28d0abb9e3c14e38fa028c011039d7dbef293a29\": rpc error: code = NotFound desc = could not find container \"c5951e7cc98206292843533e28d0abb9e3c14e38fa028c011039d7dbef293a29\": container with ID starting with c5951e7cc98206292843533e28d0abb9e3c14e38fa028c011039d7dbef293a29 not found: ID does not exist" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.501166 4755 scope.go:117] "RemoveContainer" containerID="6e63a77e7e70b06e64338ea7e08c16490c03ee4ac8de49af6c75d85a37d02614" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.516383 4755 scope.go:117] "RemoveContainer" 
containerID="5014417e0d497baa07c6659a1af1a8761ff2209cdf8b8c910fb52de22ba93a62" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.548210 4755 scope.go:117] "RemoveContainer" containerID="391852e6f7af816ca1cb8af2649ac473dbc97c165513226499c0f47b1e2d0714" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.566200 4755 scope.go:117] "RemoveContainer" containerID="6e63a77e7e70b06e64338ea7e08c16490c03ee4ac8de49af6c75d85a37d02614" Oct 06 08:26:45 crc kubenswrapper[4755]: E1006 08:26:45.566742 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e63a77e7e70b06e64338ea7e08c16490c03ee4ac8de49af6c75d85a37d02614\": container with ID starting with 6e63a77e7e70b06e64338ea7e08c16490c03ee4ac8de49af6c75d85a37d02614 not found: ID does not exist" containerID="6e63a77e7e70b06e64338ea7e08c16490c03ee4ac8de49af6c75d85a37d02614" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.566821 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e63a77e7e70b06e64338ea7e08c16490c03ee4ac8de49af6c75d85a37d02614"} err="failed to get container status \"6e63a77e7e70b06e64338ea7e08c16490c03ee4ac8de49af6c75d85a37d02614\": rpc error: code = NotFound desc = could not find container \"6e63a77e7e70b06e64338ea7e08c16490c03ee4ac8de49af6c75d85a37d02614\": container with ID starting with 6e63a77e7e70b06e64338ea7e08c16490c03ee4ac8de49af6c75d85a37d02614 not found: ID does not exist" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.566871 4755 scope.go:117] "RemoveContainer" containerID="5014417e0d497baa07c6659a1af1a8761ff2209cdf8b8c910fb52de22ba93a62" Oct 06 08:26:45 crc kubenswrapper[4755]: E1006 08:26:45.567396 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5014417e0d497baa07c6659a1af1a8761ff2209cdf8b8c910fb52de22ba93a62\": container with ID starting with 
5014417e0d497baa07c6659a1af1a8761ff2209cdf8b8c910fb52de22ba93a62 not found: ID does not exist" containerID="5014417e0d497baa07c6659a1af1a8761ff2209cdf8b8c910fb52de22ba93a62" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.567451 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5014417e0d497baa07c6659a1af1a8761ff2209cdf8b8c910fb52de22ba93a62"} err="failed to get container status \"5014417e0d497baa07c6659a1af1a8761ff2209cdf8b8c910fb52de22ba93a62\": rpc error: code = NotFound desc = could not find container \"5014417e0d497baa07c6659a1af1a8761ff2209cdf8b8c910fb52de22ba93a62\": container with ID starting with 5014417e0d497baa07c6659a1af1a8761ff2209cdf8b8c910fb52de22ba93a62 not found: ID does not exist" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.567491 4755 scope.go:117] "RemoveContainer" containerID="391852e6f7af816ca1cb8af2649ac473dbc97c165513226499c0f47b1e2d0714" Oct 06 08:26:45 crc kubenswrapper[4755]: E1006 08:26:45.568402 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"391852e6f7af816ca1cb8af2649ac473dbc97c165513226499c0f47b1e2d0714\": container with ID starting with 391852e6f7af816ca1cb8af2649ac473dbc97c165513226499c0f47b1e2d0714 not found: ID does not exist" containerID="391852e6f7af816ca1cb8af2649ac473dbc97c165513226499c0f47b1e2d0714" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.568908 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"391852e6f7af816ca1cb8af2649ac473dbc97c165513226499c0f47b1e2d0714"} err="failed to get container status \"391852e6f7af816ca1cb8af2649ac473dbc97c165513226499c0f47b1e2d0714\": rpc error: code = NotFound desc = could not find container \"391852e6f7af816ca1cb8af2649ac473dbc97c165513226499c0f47b1e2d0714\": container with ID starting with 391852e6f7af816ca1cb8af2649ac473dbc97c165513226499c0f47b1e2d0714 not found: ID does not 
exist" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.568992 4755 scope.go:117] "RemoveContainer" containerID="f4cba2cf17bdd5eae1d684ae466a5ce7269c0d949aa0689e3a185af8a3e6b3f4" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.585323 4755 scope.go:117] "RemoveContainer" containerID="9dc8ffdcf6fd145a7b488db8bf4b864ab1e4938b6e50e420c9e883c3fc7a9753" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.600522 4755 scope.go:117] "RemoveContainer" containerID="d749fabeb91ef93ef7058912fdf5a1109b0b58d6997e0071dc8ae97821a30b61" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.619028 4755 scope.go:117] "RemoveContainer" containerID="f4cba2cf17bdd5eae1d684ae466a5ce7269c0d949aa0689e3a185af8a3e6b3f4" Oct 06 08:26:45 crc kubenswrapper[4755]: E1006 08:26:45.619485 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4cba2cf17bdd5eae1d684ae466a5ce7269c0d949aa0689e3a185af8a3e6b3f4\": container with ID starting with f4cba2cf17bdd5eae1d684ae466a5ce7269c0d949aa0689e3a185af8a3e6b3f4 not found: ID does not exist" containerID="f4cba2cf17bdd5eae1d684ae466a5ce7269c0d949aa0689e3a185af8a3e6b3f4" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.619538 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4cba2cf17bdd5eae1d684ae466a5ce7269c0d949aa0689e3a185af8a3e6b3f4"} err="failed to get container status \"f4cba2cf17bdd5eae1d684ae466a5ce7269c0d949aa0689e3a185af8a3e6b3f4\": rpc error: code = NotFound desc = could not find container \"f4cba2cf17bdd5eae1d684ae466a5ce7269c0d949aa0689e3a185af8a3e6b3f4\": container with ID starting with f4cba2cf17bdd5eae1d684ae466a5ce7269c0d949aa0689e3a185af8a3e6b3f4 not found: ID does not exist" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.619591 4755 scope.go:117] "RemoveContainer" containerID="9dc8ffdcf6fd145a7b488db8bf4b864ab1e4938b6e50e420c9e883c3fc7a9753" Oct 06 08:26:45 crc 
kubenswrapper[4755]: E1006 08:26:45.620063 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc8ffdcf6fd145a7b488db8bf4b864ab1e4938b6e50e420c9e883c3fc7a9753\": container with ID starting with 9dc8ffdcf6fd145a7b488db8bf4b864ab1e4938b6e50e420c9e883c3fc7a9753 not found: ID does not exist" containerID="9dc8ffdcf6fd145a7b488db8bf4b864ab1e4938b6e50e420c9e883c3fc7a9753" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.620087 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc8ffdcf6fd145a7b488db8bf4b864ab1e4938b6e50e420c9e883c3fc7a9753"} err="failed to get container status \"9dc8ffdcf6fd145a7b488db8bf4b864ab1e4938b6e50e420c9e883c3fc7a9753\": rpc error: code = NotFound desc = could not find container \"9dc8ffdcf6fd145a7b488db8bf4b864ab1e4938b6e50e420c9e883c3fc7a9753\": container with ID starting with 9dc8ffdcf6fd145a7b488db8bf4b864ab1e4938b6e50e420c9e883c3fc7a9753 not found: ID does not exist" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.620101 4755 scope.go:117] "RemoveContainer" containerID="d749fabeb91ef93ef7058912fdf5a1109b0b58d6997e0071dc8ae97821a30b61" Oct 06 08:26:45 crc kubenswrapper[4755]: E1006 08:26:45.620454 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d749fabeb91ef93ef7058912fdf5a1109b0b58d6997e0071dc8ae97821a30b61\": container with ID starting with d749fabeb91ef93ef7058912fdf5a1109b0b58d6997e0071dc8ae97821a30b61 not found: ID does not exist" containerID="d749fabeb91ef93ef7058912fdf5a1109b0b58d6997e0071dc8ae97821a30b61" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.620605 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d749fabeb91ef93ef7058912fdf5a1109b0b58d6997e0071dc8ae97821a30b61"} err="failed to get container status 
\"d749fabeb91ef93ef7058912fdf5a1109b0b58d6997e0071dc8ae97821a30b61\": rpc error: code = NotFound desc = could not find container \"d749fabeb91ef93ef7058912fdf5a1109b0b58d6997e0071dc8ae97821a30b61\": container with ID starting with d749fabeb91ef93ef7058912fdf5a1109b0b58d6997e0071dc8ae97821a30b61 not found: ID does not exist" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.620713 4755 scope.go:117] "RemoveContainer" containerID="6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.633704 4755 scope.go:117] "RemoveContainer" containerID="54bf5fb45f182ae8ff9e7e41c8e76bc7b611364c6095abe721e174e822b33e17" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.655294 4755 scope.go:117] "RemoveContainer" containerID="4e938cb5c3340f91f4a036acd9ad47d37f63c7cccbf0a2b0ddd4895efd7b0f13" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.678039 4755 scope.go:117] "RemoveContainer" containerID="6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84" Oct 06 08:26:45 crc kubenswrapper[4755]: E1006 08:26:45.679495 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84\": container with ID starting with 6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84 not found: ID does not exist" containerID="6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.679611 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84"} err="failed to get container status \"6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84\": rpc error: code = NotFound desc = could not find container \"6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84\": container with ID starting 
with 6faa0cb322bb5d16bf43de9c36dfd4e61f668a528e28a72f8996c55048c0fd84 not found: ID does not exist" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.679701 4755 scope.go:117] "RemoveContainer" containerID="54bf5fb45f182ae8ff9e7e41c8e76bc7b611364c6095abe721e174e822b33e17" Oct 06 08:26:45 crc kubenswrapper[4755]: E1006 08:26:45.680176 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54bf5fb45f182ae8ff9e7e41c8e76bc7b611364c6095abe721e174e822b33e17\": container with ID starting with 54bf5fb45f182ae8ff9e7e41c8e76bc7b611364c6095abe721e174e822b33e17 not found: ID does not exist" containerID="54bf5fb45f182ae8ff9e7e41c8e76bc7b611364c6095abe721e174e822b33e17" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.680219 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54bf5fb45f182ae8ff9e7e41c8e76bc7b611364c6095abe721e174e822b33e17"} err="failed to get container status \"54bf5fb45f182ae8ff9e7e41c8e76bc7b611364c6095abe721e174e822b33e17\": rpc error: code = NotFound desc = could not find container \"54bf5fb45f182ae8ff9e7e41c8e76bc7b611364c6095abe721e174e822b33e17\": container with ID starting with 54bf5fb45f182ae8ff9e7e41c8e76bc7b611364c6095abe721e174e822b33e17 not found: ID does not exist" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.680235 4755 scope.go:117] "RemoveContainer" containerID="4e938cb5c3340f91f4a036acd9ad47d37f63c7cccbf0a2b0ddd4895efd7b0f13" Oct 06 08:26:45 crc kubenswrapper[4755]: E1006 08:26:45.680694 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e938cb5c3340f91f4a036acd9ad47d37f63c7cccbf0a2b0ddd4895efd7b0f13\": container with ID starting with 4e938cb5c3340f91f4a036acd9ad47d37f63c7cccbf0a2b0ddd4895efd7b0f13 not found: ID does not exist" containerID="4e938cb5c3340f91f4a036acd9ad47d37f63c7cccbf0a2b0ddd4895efd7b0f13" Oct 06 08:26:45 
crc kubenswrapper[4755]: I1006 08:26:45.680714 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e938cb5c3340f91f4a036acd9ad47d37f63c7cccbf0a2b0ddd4895efd7b0f13"} err="failed to get container status \"4e938cb5c3340f91f4a036acd9ad47d37f63c7cccbf0a2b0ddd4895efd7b0f13\": rpc error: code = NotFound desc = could not find container \"4e938cb5c3340f91f4a036acd9ad47d37f63c7cccbf0a2b0ddd4895efd7b0f13\": container with ID starting with 4e938cb5c3340f91f4a036acd9ad47d37f63c7cccbf0a2b0ddd4895efd7b0f13 not found: ID does not exist" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.895751 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62ecb1af-d269-4f5a-84ec-026b74882414" path="/var/lib/kubelet/pods/62ecb1af-d269-4f5a-84ec-026b74882414/volumes" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.896533 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fc654c8-49a2-4815-b0c6-edfc8ac3d836" path="/var/lib/kubelet/pods/6fc654c8-49a2-4815-b0c6-edfc8ac3d836/volumes" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.897297 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79294028-a667-4a44-bf46-a7597f221243" path="/var/lib/kubelet/pods/79294028-a667-4a44-bf46-a7597f221243/volumes" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.898390 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="960d9d23-73b6-49b2-8772-eca49d507f2f" path="/var/lib/kubelet/pods/960d9d23-73b6-49b2-8772-eca49d507f2f/volumes" Oct 06 08:26:45 crc kubenswrapper[4755]: I1006 08:26:45.898838 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b94e8d7e-d807-4809-ac0e-a219363e15d0" path="/var/lib/kubelet/pods/b94e8d7e-d807-4809-ac0e-a219363e15d0/volumes" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.410906 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-p4ld2" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.494386 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8z4xp"] Oct 06 08:26:46 crc kubenswrapper[4755]: E1006 08:26:46.500473 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ecb1af-d269-4f5a-84ec-026b74882414" containerName="extract-utilities" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.500545 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ecb1af-d269-4f5a-84ec-026b74882414" containerName="extract-utilities" Oct 06 08:26:46 crc kubenswrapper[4755]: E1006 08:26:46.500573 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79294028-a667-4a44-bf46-a7597f221243" containerName="extract-content" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.500586 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="79294028-a667-4a44-bf46-a7597f221243" containerName="extract-content" Oct 06 08:26:46 crc kubenswrapper[4755]: E1006 08:26:46.500635 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc654c8-49a2-4815-b0c6-edfc8ac3d836" containerName="extract-content" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.500649 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc654c8-49a2-4815-b0c6-edfc8ac3d836" containerName="extract-content" Oct 06 08:26:46 crc kubenswrapper[4755]: E1006 08:26:46.500684 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="960d9d23-73b6-49b2-8772-eca49d507f2f" containerName="marketplace-operator" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.500694 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="960d9d23-73b6-49b2-8772-eca49d507f2f" containerName="marketplace-operator" Oct 06 08:26:46 crc kubenswrapper[4755]: E1006 08:26:46.500715 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b94e8d7e-d807-4809-ac0e-a219363e15d0" containerName="registry-server" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.500725 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94e8d7e-d807-4809-ac0e-a219363e15d0" containerName="registry-server" Oct 06 08:26:46 crc kubenswrapper[4755]: E1006 08:26:46.500747 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ecb1af-d269-4f5a-84ec-026b74882414" containerName="registry-server" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.500757 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ecb1af-d269-4f5a-84ec-026b74882414" containerName="registry-server" Oct 06 08:26:46 crc kubenswrapper[4755]: E1006 08:26:46.500772 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ecb1af-d269-4f5a-84ec-026b74882414" containerName="extract-content" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.500787 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ecb1af-d269-4f5a-84ec-026b74882414" containerName="extract-content" Oct 06 08:26:46 crc kubenswrapper[4755]: E1006 08:26:46.500812 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b94e8d7e-d807-4809-ac0e-a219363e15d0" containerName="extract-content" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.500821 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94e8d7e-d807-4809-ac0e-a219363e15d0" containerName="extract-content" Oct 06 08:26:46 crc kubenswrapper[4755]: E1006 08:26:46.500838 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b94e8d7e-d807-4809-ac0e-a219363e15d0" containerName="extract-utilities" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.500986 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94e8d7e-d807-4809-ac0e-a219363e15d0" containerName="extract-utilities" Oct 06 08:26:46 crc kubenswrapper[4755]: E1006 08:26:46.501043 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="79294028-a667-4a44-bf46-a7597f221243" containerName="registry-server" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.501054 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="79294028-a667-4a44-bf46-a7597f221243" containerName="registry-server" Oct 06 08:26:46 crc kubenswrapper[4755]: E1006 08:26:46.501075 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc654c8-49a2-4815-b0c6-edfc8ac3d836" containerName="registry-server" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.501086 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc654c8-49a2-4815-b0c6-edfc8ac3d836" containerName="registry-server" Oct 06 08:26:46 crc kubenswrapper[4755]: E1006 08:26:46.501130 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc654c8-49a2-4815-b0c6-edfc8ac3d836" containerName="extract-utilities" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.501147 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc654c8-49a2-4815-b0c6-edfc8ac3d836" containerName="extract-utilities" Oct 06 08:26:46 crc kubenswrapper[4755]: E1006 08:26:46.501170 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79294028-a667-4a44-bf46-a7597f221243" containerName="extract-utilities" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.501211 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="79294028-a667-4a44-bf46-a7597f221243" containerName="extract-utilities" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.504639 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="79294028-a667-4a44-bf46-a7597f221243" containerName="registry-server" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.504811 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="960d9d23-73b6-49b2-8772-eca49d507f2f" containerName="marketplace-operator" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.504893 4755 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="6fc654c8-49a2-4815-b0c6-edfc8ac3d836" containerName="registry-server" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.504979 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b94e8d7e-d807-4809-ac0e-a219363e15d0" containerName="registry-server" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.505045 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ecb1af-d269-4f5a-84ec-026b74882414" containerName="registry-server" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.506988 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.510244 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.518534 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8z4xp"] Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.677393 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d7f1d8-b288-4877-9ab3-e3710c46ad0d-catalog-content\") pod \"redhat-marketplace-8z4xp\" (UID: \"92d7f1d8-b288-4877-9ab3-e3710c46ad0d\") " pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.677526 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d7f1d8-b288-4877-9ab3-e3710c46ad0d-utilities\") pod \"redhat-marketplace-8z4xp\" (UID: \"92d7f1d8-b288-4877-9ab3-e3710c46ad0d\") " pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.677743 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chq5t\" (UniqueName: \"kubernetes.io/projected/92d7f1d8-b288-4877-9ab3-e3710c46ad0d-kube-api-access-chq5t\") pod \"redhat-marketplace-8z4xp\" (UID: \"92d7f1d8-b288-4877-9ab3-e3710c46ad0d\") " pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.690286 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jbt9h"] Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.696181 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.698537 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.700868 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jbt9h"] Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.778334 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chq5t\" (UniqueName: \"kubernetes.io/projected/92d7f1d8-b288-4877-9ab3-e3710c46ad0d-kube-api-access-chq5t\") pod \"redhat-marketplace-8z4xp\" (UID: \"92d7f1d8-b288-4877-9ab3-e3710c46ad0d\") " pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.778381 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f819d94-d78c-453c-92f7-2e2f66c4f5b8-catalog-content\") pod \"redhat-operators-jbt9h\" (UID: \"9f819d94-d78c-453c-92f7-2e2f66c4f5b8\") " pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.778413 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d7f1d8-b288-4877-9ab3-e3710c46ad0d-catalog-content\") pod \"redhat-marketplace-8z4xp\" (UID: \"92d7f1d8-b288-4877-9ab3-e3710c46ad0d\") " pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.778429 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d7f1d8-b288-4877-9ab3-e3710c46ad0d-utilities\") pod \"redhat-marketplace-8z4xp\" (UID: \"92d7f1d8-b288-4877-9ab3-e3710c46ad0d\") " pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.778464 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq2ql\" (UniqueName: \"kubernetes.io/projected/9f819d94-d78c-453c-92f7-2e2f66c4f5b8-kube-api-access-hq2ql\") pod \"redhat-operators-jbt9h\" (UID: \"9f819d94-d78c-453c-92f7-2e2f66c4f5b8\") " pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.778482 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f819d94-d78c-453c-92f7-2e2f66c4f5b8-utilities\") pod \"redhat-operators-jbt9h\" (UID: \"9f819d94-d78c-453c-92f7-2e2f66c4f5b8\") " pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.778924 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d7f1d8-b288-4877-9ab3-e3710c46ad0d-catalog-content\") pod \"redhat-marketplace-8z4xp\" (UID: \"92d7f1d8-b288-4877-9ab3-e3710c46ad0d\") " pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.778990 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d7f1d8-b288-4877-9ab3-e3710c46ad0d-utilities\") pod \"redhat-marketplace-8z4xp\" (UID: \"92d7f1d8-b288-4877-9ab3-e3710c46ad0d\") " pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.800662 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chq5t\" (UniqueName: \"kubernetes.io/projected/92d7f1d8-b288-4877-9ab3-e3710c46ad0d-kube-api-access-chq5t\") pod \"redhat-marketplace-8z4xp\" (UID: \"92d7f1d8-b288-4877-9ab3-e3710c46ad0d\") " pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.832960 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.880245 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq2ql\" (UniqueName: \"kubernetes.io/projected/9f819d94-d78c-453c-92f7-2e2f66c4f5b8-kube-api-access-hq2ql\") pod \"redhat-operators-jbt9h\" (UID: \"9f819d94-d78c-453c-92f7-2e2f66c4f5b8\") " pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.880303 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f819d94-d78c-453c-92f7-2e2f66c4f5b8-utilities\") pod \"redhat-operators-jbt9h\" (UID: \"9f819d94-d78c-453c-92f7-2e2f66c4f5b8\") " pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.880372 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f819d94-d78c-453c-92f7-2e2f66c4f5b8-catalog-content\") pod \"redhat-operators-jbt9h\" (UID: \"9f819d94-d78c-453c-92f7-2e2f66c4f5b8\") " 
pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.881124 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f819d94-d78c-453c-92f7-2e2f66c4f5b8-utilities\") pod \"redhat-operators-jbt9h\" (UID: \"9f819d94-d78c-453c-92f7-2e2f66c4f5b8\") " pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.881225 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f819d94-d78c-453c-92f7-2e2f66c4f5b8-catalog-content\") pod \"redhat-operators-jbt9h\" (UID: \"9f819d94-d78c-453c-92f7-2e2f66c4f5b8\") " pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:46 crc kubenswrapper[4755]: I1006 08:26:46.896156 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq2ql\" (UniqueName: \"kubernetes.io/projected/9f819d94-d78c-453c-92f7-2e2f66c4f5b8-kube-api-access-hq2ql\") pod \"redhat-operators-jbt9h\" (UID: \"9f819d94-d78c-453c-92f7-2e2f66c4f5b8\") " pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:47 crc kubenswrapper[4755]: I1006 08:26:47.010639 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:47 crc kubenswrapper[4755]: I1006 08:26:47.015025 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8z4xp"] Oct 06 08:26:47 crc kubenswrapper[4755]: W1006 08:26:47.028087 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d7f1d8_b288_4877_9ab3_e3710c46ad0d.slice/crio-ca6dce5babd9115bd9b5d7a979ca185bd487a1d0a9760d33cf7d3cd59a206d71 WatchSource:0}: Error finding container ca6dce5babd9115bd9b5d7a979ca185bd487a1d0a9760d33cf7d3cd59a206d71: Status 404 returned error can't find the container with id ca6dce5babd9115bd9b5d7a979ca185bd487a1d0a9760d33cf7d3cd59a206d71 Oct 06 08:26:47 crc kubenswrapper[4755]: I1006 08:26:47.414798 4755 generic.go:334] "Generic (PLEG): container finished" podID="92d7f1d8-b288-4877-9ab3-e3710c46ad0d" containerID="ce0cb141afe657aa7460ea3ac583bd6acfcf4440b2e643785e82a5837ba65517" exitCode=0 Oct 06 08:26:47 crc kubenswrapper[4755]: I1006 08:26:47.415754 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8z4xp" event={"ID":"92d7f1d8-b288-4877-9ab3-e3710c46ad0d","Type":"ContainerDied","Data":"ce0cb141afe657aa7460ea3ac583bd6acfcf4440b2e643785e82a5837ba65517"} Oct 06 08:26:47 crc kubenswrapper[4755]: I1006 08:26:47.415783 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8z4xp" event={"ID":"92d7f1d8-b288-4877-9ab3-e3710c46ad0d","Type":"ContainerStarted","Data":"ca6dce5babd9115bd9b5d7a979ca185bd487a1d0a9760d33cf7d3cd59a206d71"} Oct 06 08:26:47 crc kubenswrapper[4755]: I1006 08:26:47.415805 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jbt9h"] Oct 06 08:26:47 crc kubenswrapper[4755]: W1006 08:26:47.429119 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f819d94_d78c_453c_92f7_2e2f66c4f5b8.slice/crio-b0fb39de9521f9bab2f263ca8556084ed0fac0378d8250053b9bbf281da30de7 WatchSource:0}: Error finding container b0fb39de9521f9bab2f263ca8556084ed0fac0378d8250053b9bbf281da30de7: Status 404 returned error can't find the container with id b0fb39de9521f9bab2f263ca8556084ed0fac0378d8250053b9bbf281da30de7 Oct 06 08:26:48 crc kubenswrapper[4755]: I1006 08:26:48.424007 4755 generic.go:334] "Generic (PLEG): container finished" podID="92d7f1d8-b288-4877-9ab3-e3710c46ad0d" containerID="982fb8e5b1fc7f17a6c092c9e0633bff0123f6c5f5c545dad82cb93d90e72293" exitCode=0 Oct 06 08:26:48 crc kubenswrapper[4755]: I1006 08:26:48.424306 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8z4xp" event={"ID":"92d7f1d8-b288-4877-9ab3-e3710c46ad0d","Type":"ContainerDied","Data":"982fb8e5b1fc7f17a6c092c9e0633bff0123f6c5f5c545dad82cb93d90e72293"} Oct 06 08:26:48 crc kubenswrapper[4755]: I1006 08:26:48.426251 4755 generic.go:334] "Generic (PLEG): container finished" podID="9f819d94-d78c-453c-92f7-2e2f66c4f5b8" containerID="91156bec6cc27290d271c28b86cc3c7b7e819ab6764e7f22ec96a7d9ff28d46f" exitCode=0 Oct 06 08:26:48 crc kubenswrapper[4755]: I1006 08:26:48.426285 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jbt9h" event={"ID":"9f819d94-d78c-453c-92f7-2e2f66c4f5b8","Type":"ContainerDied","Data":"91156bec6cc27290d271c28b86cc3c7b7e819ab6764e7f22ec96a7d9ff28d46f"} Oct 06 08:26:48 crc kubenswrapper[4755]: I1006 08:26:48.426310 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jbt9h" event={"ID":"9f819d94-d78c-453c-92f7-2e2f66c4f5b8","Type":"ContainerStarted","Data":"b0fb39de9521f9bab2f263ca8556084ed0fac0378d8250053b9bbf281da30de7"} Oct 06 08:26:48 crc kubenswrapper[4755]: I1006 08:26:48.886466 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-p8wxk"] Oct 06 08:26:48 crc kubenswrapper[4755]: I1006 08:26:48.891003 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:48 crc kubenswrapper[4755]: I1006 08:26:48.895951 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 06 08:26:48 crc kubenswrapper[4755]: I1006 08:26:48.901737 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p8wxk"] Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.010650 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08-utilities\") pod \"certified-operators-p8wxk\" (UID: \"6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08\") " pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.011142 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdtnv\" (UniqueName: \"kubernetes.io/projected/6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08-kube-api-access-hdtnv\") pod \"certified-operators-p8wxk\" (UID: \"6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08\") " pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.011204 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08-catalog-content\") pod \"certified-operators-p8wxk\" (UID: \"6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08\") " pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.086045 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-mx9pz"] Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.087019 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.089169 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.105622 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mx9pz"] Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.113038 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08-catalog-content\") pod \"certified-operators-p8wxk\" (UID: \"6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08\") " pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.113101 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08-utilities\") pod \"certified-operators-p8wxk\" (UID: \"6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08\") " pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.113137 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdtnv\" (UniqueName: \"kubernetes.io/projected/6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08-kube-api-access-hdtnv\") pod \"certified-operators-p8wxk\" (UID: \"6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08\") " pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.113791 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08-catalog-content\") pod \"certified-operators-p8wxk\" (UID: \"6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08\") " pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.113999 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08-utilities\") pod \"certified-operators-p8wxk\" (UID: \"6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08\") " pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.140299 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdtnv\" (UniqueName: \"kubernetes.io/projected/6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08-kube-api-access-hdtnv\") pod \"certified-operators-p8wxk\" (UID: \"6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08\") " pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.209206 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.214439 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c7c681-770b-49f0-aeae-e751bedb73c0-catalog-content\") pod \"community-operators-mx9pz\" (UID: \"a5c7c681-770b-49f0-aeae-e751bedb73c0\") " pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.214493 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgwpd\" (UniqueName: \"kubernetes.io/projected/a5c7c681-770b-49f0-aeae-e751bedb73c0-kube-api-access-kgwpd\") pod \"community-operators-mx9pz\" (UID: \"a5c7c681-770b-49f0-aeae-e751bedb73c0\") " pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.214521 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c7c681-770b-49f0-aeae-e751bedb73c0-utilities\") pod \"community-operators-mx9pz\" (UID: \"a5c7c681-770b-49f0-aeae-e751bedb73c0\") " pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.315413 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c7c681-770b-49f0-aeae-e751bedb73c0-utilities\") pod \"community-operators-mx9pz\" (UID: \"a5c7c681-770b-49f0-aeae-e751bedb73c0\") " pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.316041 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c7c681-770b-49f0-aeae-e751bedb73c0-catalog-content\") pod 
\"community-operators-mx9pz\" (UID: \"a5c7c681-770b-49f0-aeae-e751bedb73c0\") " pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.316086 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgwpd\" (UniqueName: \"kubernetes.io/projected/a5c7c681-770b-49f0-aeae-e751bedb73c0-kube-api-access-kgwpd\") pod \"community-operators-mx9pz\" (UID: \"a5c7c681-770b-49f0-aeae-e751bedb73c0\") " pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.317096 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c7c681-770b-49f0-aeae-e751bedb73c0-utilities\") pod \"community-operators-mx9pz\" (UID: \"a5c7c681-770b-49f0-aeae-e751bedb73c0\") " pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.317122 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c7c681-770b-49f0-aeae-e751bedb73c0-catalog-content\") pod \"community-operators-mx9pz\" (UID: \"a5c7c681-770b-49f0-aeae-e751bedb73c0\") " pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.335260 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgwpd\" (UniqueName: \"kubernetes.io/projected/a5c7c681-770b-49f0-aeae-e751bedb73c0-kube-api-access-kgwpd\") pod \"community-operators-mx9pz\" (UID: \"a5c7c681-770b-49f0-aeae-e751bedb73c0\") " pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.404326 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.418709 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p8wxk"] Oct 06 08:26:49 crc kubenswrapper[4755]: W1006 08:26:49.434176 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ce9263d_ebe4_4a6e_ba20_014e6f7d6b08.slice/crio-a6720eed799305237bb7e984ba3d3c427945c36e05e6fdd06bd977990c4cb79c WatchSource:0}: Error finding container a6720eed799305237bb7e984ba3d3c427945c36e05e6fdd06bd977990c4cb79c: Status 404 returned error can't find the container with id a6720eed799305237bb7e984ba3d3c427945c36e05e6fdd06bd977990c4cb79c Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.436730 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jbt9h" event={"ID":"9f819d94-d78c-453c-92f7-2e2f66c4f5b8","Type":"ContainerStarted","Data":"5e1ecb24ff6dc86e8e3136331873a1a945d9dd4d603fa00d24630919ddc74e75"} Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.443530 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8z4xp" event={"ID":"92d7f1d8-b288-4877-9ab3-e3710c46ad0d","Type":"ContainerStarted","Data":"0dac0274abacc970cefbe77c8fcb192873214c3eb0ce1d5f0751bcef2a7dbd68"} Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.474982 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8z4xp" podStartSLOduration=2.02445142 podStartE2EDuration="3.474962209s" podCreationTimestamp="2025-10-06 08:26:46 +0000 UTC" firstStartedPulling="2025-10-06 08:26:47.41977448 +0000 UTC m=+264.249089694" lastFinishedPulling="2025-10-06 08:26:48.870285269 +0000 UTC m=+265.699600483" observedRunningTime="2025-10-06 08:26:49.473843504 +0000 UTC m=+266.303158728" 
watchObservedRunningTime="2025-10-06 08:26:49.474962209 +0000 UTC m=+266.304277423" Oct 06 08:26:49 crc kubenswrapper[4755]: I1006 08:26:49.713799 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mx9pz"] Oct 06 08:26:50 crc kubenswrapper[4755]: I1006 08:26:50.450215 4755 generic.go:334] "Generic (PLEG): container finished" podID="9f819d94-d78c-453c-92f7-2e2f66c4f5b8" containerID="5e1ecb24ff6dc86e8e3136331873a1a945d9dd4d603fa00d24630919ddc74e75" exitCode=0 Oct 06 08:26:50 crc kubenswrapper[4755]: I1006 08:26:50.450290 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jbt9h" event={"ID":"9f819d94-d78c-453c-92f7-2e2f66c4f5b8","Type":"ContainerDied","Data":"5e1ecb24ff6dc86e8e3136331873a1a945d9dd4d603fa00d24630919ddc74e75"} Oct 06 08:26:50 crc kubenswrapper[4755]: I1006 08:26:50.467452 4755 generic.go:334] "Generic (PLEG): container finished" podID="6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08" containerID="a97b7c56fe47bf21f79cb7e05f860daaf6b78a755e6397c8ba42dc32431f5a2f" exitCode=0 Oct 06 08:26:50 crc kubenswrapper[4755]: I1006 08:26:50.467555 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8wxk" event={"ID":"6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08","Type":"ContainerDied","Data":"a97b7c56fe47bf21f79cb7e05f860daaf6b78a755e6397c8ba42dc32431f5a2f"} Oct 06 08:26:50 crc kubenswrapper[4755]: I1006 08:26:50.467628 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8wxk" event={"ID":"6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08","Type":"ContainerStarted","Data":"a6720eed799305237bb7e984ba3d3c427945c36e05e6fdd06bd977990c4cb79c"} Oct 06 08:26:50 crc kubenswrapper[4755]: I1006 08:26:50.470502 4755 generic.go:334] "Generic (PLEG): container finished" podID="a5c7c681-770b-49f0-aeae-e751bedb73c0" containerID="39fef6ba2dd7bd40fa81646c4a0787833536ecb9acebc91eea71fdf4e2e47990" exitCode=0 Oct 06 
08:26:50 crc kubenswrapper[4755]: I1006 08:26:50.473857 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx9pz" event={"ID":"a5c7c681-770b-49f0-aeae-e751bedb73c0","Type":"ContainerDied","Data":"39fef6ba2dd7bd40fa81646c4a0787833536ecb9acebc91eea71fdf4e2e47990"} Oct 06 08:26:50 crc kubenswrapper[4755]: I1006 08:26:50.473888 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx9pz" event={"ID":"a5c7c681-770b-49f0-aeae-e751bedb73c0","Type":"ContainerStarted","Data":"7b49e096fe54b9fe90cf94580282e9a46e0a8562d3986cdbdfa803fceaa522dd"} Oct 06 08:26:51 crc kubenswrapper[4755]: I1006 08:26:51.478645 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jbt9h" event={"ID":"9f819d94-d78c-453c-92f7-2e2f66c4f5b8","Type":"ContainerStarted","Data":"2eddf9e50302d60adc9ff22dc40c006a002d48945954e33e4d1500c9574689bf"} Oct 06 08:26:51 crc kubenswrapper[4755]: I1006 08:26:51.480756 4755 generic.go:334] "Generic (PLEG): container finished" podID="6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08" containerID="84eddb53eac3b0dc8a4113d4a47221c70d1c00a98e2895f51136c5e21dd25506" exitCode=0 Oct 06 08:26:51 crc kubenswrapper[4755]: I1006 08:26:51.480825 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8wxk" event={"ID":"6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08","Type":"ContainerDied","Data":"84eddb53eac3b0dc8a4113d4a47221c70d1c00a98e2895f51136c5e21dd25506"} Oct 06 08:26:51 crc kubenswrapper[4755]: I1006 08:26:51.482977 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx9pz" event={"ID":"a5c7c681-770b-49f0-aeae-e751bedb73c0","Type":"ContainerStarted","Data":"d6f38b86282f954fcc4950e1822d711a3637a2e5b326e2cc48291746e6a1c1f3"} Oct 06 08:26:51 crc kubenswrapper[4755]: I1006 08:26:51.519982 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-jbt9h" podStartSLOduration=2.985848253 podStartE2EDuration="5.519957635s" podCreationTimestamp="2025-10-06 08:26:46 +0000 UTC" firstStartedPulling="2025-10-06 08:26:48.42787517 +0000 UTC m=+265.257190384" lastFinishedPulling="2025-10-06 08:26:50.961984552 +0000 UTC m=+267.791299766" observedRunningTime="2025-10-06 08:26:51.498084753 +0000 UTC m=+268.327399987" watchObservedRunningTime="2025-10-06 08:26:51.519957635 +0000 UTC m=+268.349272839" Oct 06 08:26:52 crc kubenswrapper[4755]: I1006 08:26:52.489982 4755 generic.go:334] "Generic (PLEG): container finished" podID="a5c7c681-770b-49f0-aeae-e751bedb73c0" containerID="d6f38b86282f954fcc4950e1822d711a3637a2e5b326e2cc48291746e6a1c1f3" exitCode=0 Oct 06 08:26:52 crc kubenswrapper[4755]: I1006 08:26:52.490074 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx9pz" event={"ID":"a5c7c681-770b-49f0-aeae-e751bedb73c0","Type":"ContainerDied","Data":"d6f38b86282f954fcc4950e1822d711a3637a2e5b326e2cc48291746e6a1c1f3"} Oct 06 08:26:53 crc kubenswrapper[4755]: I1006 08:26:53.509328 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx9pz" event={"ID":"a5c7c681-770b-49f0-aeae-e751bedb73c0","Type":"ContainerStarted","Data":"7a02f6b72a17388f9753142db7695fb0656b10422f0971a8c57ae4017c243ea7"} Oct 06 08:26:53 crc kubenswrapper[4755]: I1006 08:26:53.513089 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8wxk" event={"ID":"6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08","Type":"ContainerStarted","Data":"67adcdc2bfa9530ac79f7295f344dc6b1a6fa3003999a09019cd119d1827891c"} Oct 06 08:26:53 crc kubenswrapper[4755]: I1006 08:26:53.553584 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mx9pz" podStartSLOduration=2.066169376 podStartE2EDuration="4.553555398s" 
podCreationTimestamp="2025-10-06 08:26:49 +0000 UTC" firstStartedPulling="2025-10-06 08:26:50.477006537 +0000 UTC m=+267.306321761" lastFinishedPulling="2025-10-06 08:26:52.964392569 +0000 UTC m=+269.793707783" observedRunningTime="2025-10-06 08:26:53.530142413 +0000 UTC m=+270.359457627" watchObservedRunningTime="2025-10-06 08:26:53.553555398 +0000 UTC m=+270.382870612" Oct 06 08:26:53 crc kubenswrapper[4755]: I1006 08:26:53.554423 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p8wxk" podStartSLOduration=4.105513025 podStartE2EDuration="5.554417958s" podCreationTimestamp="2025-10-06 08:26:48 +0000 UTC" firstStartedPulling="2025-10-06 08:26:50.468865908 +0000 UTC m=+267.298181122" lastFinishedPulling="2025-10-06 08:26:51.917770841 +0000 UTC m=+268.747086055" observedRunningTime="2025-10-06 08:26:53.55133071 +0000 UTC m=+270.380645924" watchObservedRunningTime="2025-10-06 08:26:53.554417958 +0000 UTC m=+270.383733172" Oct 06 08:26:56 crc kubenswrapper[4755]: I1006 08:26:56.833924 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:56 crc kubenswrapper[4755]: I1006 08:26:56.834420 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:56 crc kubenswrapper[4755]: I1006 08:26:56.891679 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:57 crc kubenswrapper[4755]: I1006 08:26:57.011203 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:57 crc kubenswrapper[4755]: I1006 08:26:57.011308 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:57 crc kubenswrapper[4755]: I1006 
08:26:57.048976 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:57 crc kubenswrapper[4755]: I1006 08:26:57.582763 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8z4xp" Oct 06 08:26:57 crc kubenswrapper[4755]: I1006 08:26:57.582837 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jbt9h" Oct 06 08:26:59 crc kubenswrapper[4755]: I1006 08:26:59.210766 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:59 crc kubenswrapper[4755]: I1006 08:26:59.210825 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:59 crc kubenswrapper[4755]: I1006 08:26:59.264283 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:26:59 crc kubenswrapper[4755]: I1006 08:26:59.405209 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:59 crc kubenswrapper[4755]: I1006 08:26:59.405253 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:59 crc kubenswrapper[4755]: I1006 08:26:59.486966 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:59 crc kubenswrapper[4755]: I1006 08:26:59.591847 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mx9pz" Oct 06 08:26:59 crc kubenswrapper[4755]: I1006 08:26:59.600885 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-p8wxk" Oct 06 08:28:18 crc kubenswrapper[4755]: I1006 08:28:18.912131 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:28:18 crc kubenswrapper[4755]: I1006 08:28:18.912756 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:28:48 crc kubenswrapper[4755]: I1006 08:28:48.912457 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:28:48 crc kubenswrapper[4755]: I1006 08:28:48.913276 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:29:10 crc kubenswrapper[4755]: I1006 08:29:10.893359 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dgg8s"] Oct 06 08:29:10 crc kubenswrapper[4755]: I1006 08:29:10.896204 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:10 crc kubenswrapper[4755]: I1006 08:29:10.908364 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dgg8s"] Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.019222 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l47kg\" (UniqueName: \"kubernetes.io/projected/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-kube-api-access-l47kg\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.019303 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-registry-certificates\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.019346 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.019385 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-registry-tls\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.019631 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.019753 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.019800 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-trusted-ca\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.020001 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-bound-sa-token\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.050894 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.121386 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-trusted-ca\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.121666 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-bound-sa-token\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.121746 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l47kg\" (UniqueName: \"kubernetes.io/projected/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-kube-api-access-l47kg\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.121826 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-registry-certificates\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.121901 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.121971 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-registry-tls\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.122039 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.123149 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.123523 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-trusted-ca\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 
08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.123814 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-registry-certificates\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.129625 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-registry-tls\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.130354 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.138476 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l47kg\" (UniqueName: \"kubernetes.io/projected/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-kube-api-access-l47kg\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.152405 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ad2282c-c6a9-4727-8e82-7fb29fa7e681-bound-sa-token\") pod \"image-registry-66df7c8f76-dgg8s\" (UID: \"8ad2282c-c6a9-4727-8e82-7fb29fa7e681\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.213216 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:11 crc kubenswrapper[4755]: I1006 08:29:11.410554 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dgg8s"] Oct 06 08:29:12 crc kubenswrapper[4755]: I1006 08:29:12.356400 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" event={"ID":"8ad2282c-c6a9-4727-8e82-7fb29fa7e681","Type":"ContainerStarted","Data":"7be005975f5b9163a6123077adc5d93b927ced0499b829c545b7dc0125ef8b2a"} Oct 06 08:29:12 crc kubenswrapper[4755]: I1006 08:29:12.357108 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" event={"ID":"8ad2282c-c6a9-4727-8e82-7fb29fa7e681","Type":"ContainerStarted","Data":"ee8504e8fefa0f266873202d5192f360ff5f0153d52b61dc5317b450db91897d"} Oct 06 08:29:12 crc kubenswrapper[4755]: I1006 08:29:12.357149 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:12 crc kubenswrapper[4755]: I1006 08:29:12.389029 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" podStartSLOduration=2.388974189 podStartE2EDuration="2.388974189s" podCreationTimestamp="2025-10-06 08:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:29:12.38773048 +0000 UTC m=+409.217045764" watchObservedRunningTime="2025-10-06 08:29:12.388974189 +0000 UTC m=+409.218289413" Oct 06 08:29:18 crc kubenswrapper[4755]: I1006 08:29:18.912371 4755 patch_prober.go:28] interesting 
pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:29:18 crc kubenswrapper[4755]: I1006 08:29:18.912726 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:29:18 crc kubenswrapper[4755]: I1006 08:29:18.912774 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:29:18 crc kubenswrapper[4755]: I1006 08:29:18.913293 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fec0c4eb81f7712bab171b121e51397b6025d5a32e7a8d750be5c472df105d18"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:29:18 crc kubenswrapper[4755]: I1006 08:29:18.913367 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://fec0c4eb81f7712bab171b121e51397b6025d5a32e7a8d750be5c472df105d18" gracePeriod=600 Oct 06 08:29:19 crc kubenswrapper[4755]: I1006 08:29:19.401893 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="fec0c4eb81f7712bab171b121e51397b6025d5a32e7a8d750be5c472df105d18" exitCode=0 Oct 06 08:29:19 crc kubenswrapper[4755]: I1006 08:29:19.402007 
4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"fec0c4eb81f7712bab171b121e51397b6025d5a32e7a8d750be5c472df105d18"} Oct 06 08:29:19 crc kubenswrapper[4755]: I1006 08:29:19.402316 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"d91d9012e478d7f838adb567aaf83be7e24217db74ea1547bb0d299bd1231bbd"} Oct 06 08:29:19 crc kubenswrapper[4755]: I1006 08:29:19.402342 4755 scope.go:117] "RemoveContainer" containerID="33bdb8ee1621b5e0d198a7234c9c15aee9a02ae2df1b8b69c37a96dce650dff2" Oct 06 08:29:31 crc kubenswrapper[4755]: I1006 08:29:31.221666 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-dgg8s" Oct 06 08:29:31 crc kubenswrapper[4755]: I1006 08:29:31.303851 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g6zp7"] Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.353498 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" podUID="bb3290ed-89c6-4367-a39c-0c8fc61a3f88" containerName="registry" containerID="cri-o://a5e89d04175521116eb204278e2afdc27da65997071619922d124f10a0bb5ed6" gracePeriod=30 Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.620293 4755 generic.go:334] "Generic (PLEG): container finished" podID="bb3290ed-89c6-4367-a39c-0c8fc61a3f88" containerID="a5e89d04175521116eb204278e2afdc27da65997071619922d124f10a0bb5ed6" exitCode=0 Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.620376 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" 
event={"ID":"bb3290ed-89c6-4367-a39c-0c8fc61a3f88","Type":"ContainerDied","Data":"a5e89d04175521116eb204278e2afdc27da65997071619922d124f10a0bb5ed6"} Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.704212 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.894405 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-bound-sa-token\") pod \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.894756 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-registry-tls\") pod \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.894933 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-ca-trust-extracted\") pod \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.895177 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-registry-certificates\") pod \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.895238 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-installation-pull-secrets\") pod \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.895434 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.895939 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcgvq\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-kube-api-access-pcgvq\") pod \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.896015 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-trusted-ca\") pod \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\" (UID: \"bb3290ed-89c6-4367-a39c-0c8fc61a3f88\") " Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.898401 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bb3290ed-89c6-4367-a39c-0c8fc61a3f88" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.898706 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bb3290ed-89c6-4367-a39c-0c8fc61a3f88" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.900937 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bb3290ed-89c6-4367-a39c-0c8fc61a3f88" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.901612 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-kube-api-access-pcgvq" (OuterVolumeSpecName: "kube-api-access-pcgvq") pod "bb3290ed-89c6-4367-a39c-0c8fc61a3f88" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88"). InnerVolumeSpecName "kube-api-access-pcgvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.901953 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bb3290ed-89c6-4367-a39c-0c8fc61a3f88" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.902558 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bb3290ed-89c6-4367-a39c-0c8fc61a3f88" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.910822 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "bb3290ed-89c6-4367-a39c-0c8fc61a3f88" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.916829 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bb3290ed-89c6-4367-a39c-0c8fc61a3f88" (UID: "bb3290ed-89c6-4367-a39c-0c8fc61a3f88"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.999093 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.999373 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.999463 4755 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.999539 4755 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.999692 4755 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.999772 4755 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 08:29:56 crc kubenswrapper[4755]: I1006 08:29:56.999845 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcgvq\" (UniqueName: \"kubernetes.io/projected/bb3290ed-89c6-4367-a39c-0c8fc61a3f88-kube-api-access-pcgvq\") on node \"crc\" DevicePath \"\"" Oct 06 08:29:57 crc 
kubenswrapper[4755]: I1006 08:29:57.631862 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" event={"ID":"bb3290ed-89c6-4367-a39c-0c8fc61a3f88","Type":"ContainerDied","Data":"687a6f37132ebf91fae0693c0995f57718e86344e643f5c3f64061382a74ad0e"} Oct 06 08:29:57 crc kubenswrapper[4755]: I1006 08:29:57.631938 4755 scope.go:117] "RemoveContainer" containerID="a5e89d04175521116eb204278e2afdc27da65997071619922d124f10a0bb5ed6" Oct 06 08:29:57 crc kubenswrapper[4755]: I1006 08:29:57.631996 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g6zp7" Oct 06 08:29:57 crc kubenswrapper[4755]: I1006 08:29:57.671055 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g6zp7"] Oct 06 08:29:57 crc kubenswrapper[4755]: I1006 08:29:57.676835 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g6zp7"] Oct 06 08:29:57 crc kubenswrapper[4755]: I1006 08:29:57.890502 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb3290ed-89c6-4367-a39c-0c8fc61a3f88" path="/var/lib/kubelet/pods/bb3290ed-89c6-4367-a39c-0c8fc61a3f88/volumes" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.148617 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t"] Oct 06 08:30:00 crc kubenswrapper[4755]: E1006 08:30:00.149323 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3290ed-89c6-4367-a39c-0c8fc61a3f88" containerName="registry" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.149347 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3290ed-89c6-4367-a39c-0c8fc61a3f88" containerName="registry" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.149538 4755 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="bb3290ed-89c6-4367-a39c-0c8fc61a3f88" containerName="registry" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.150146 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.152136 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.152209 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.160751 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t"] Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.248646 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cac01e8-f35b-4518-97e6-358a42a2620d-secret-volume\") pod \"collect-profiles-29328990-nf79t\" (UID: \"8cac01e8-f35b-4518-97e6-358a42a2620d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.248741 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6xg5\" (UniqueName: \"kubernetes.io/projected/8cac01e8-f35b-4518-97e6-358a42a2620d-kube-api-access-g6xg5\") pod \"collect-profiles-29328990-nf79t\" (UID: \"8cac01e8-f35b-4518-97e6-358a42a2620d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.248916 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/8cac01e8-f35b-4518-97e6-358a42a2620d-config-volume\") pod \"collect-profiles-29328990-nf79t\" (UID: \"8cac01e8-f35b-4518-97e6-358a42a2620d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.350717 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6xg5\" (UniqueName: \"kubernetes.io/projected/8cac01e8-f35b-4518-97e6-358a42a2620d-kube-api-access-g6xg5\") pod \"collect-profiles-29328990-nf79t\" (UID: \"8cac01e8-f35b-4518-97e6-358a42a2620d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.350822 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cac01e8-f35b-4518-97e6-358a42a2620d-config-volume\") pod \"collect-profiles-29328990-nf79t\" (UID: \"8cac01e8-f35b-4518-97e6-358a42a2620d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.350886 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cac01e8-f35b-4518-97e6-358a42a2620d-secret-volume\") pod \"collect-profiles-29328990-nf79t\" (UID: \"8cac01e8-f35b-4518-97e6-358a42a2620d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.352060 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cac01e8-f35b-4518-97e6-358a42a2620d-config-volume\") pod \"collect-profiles-29328990-nf79t\" (UID: \"8cac01e8-f35b-4518-97e6-358a42a2620d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.361405 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cac01e8-f35b-4518-97e6-358a42a2620d-secret-volume\") pod \"collect-profiles-29328990-nf79t\" (UID: \"8cac01e8-f35b-4518-97e6-358a42a2620d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.368185 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6xg5\" (UniqueName: \"kubernetes.io/projected/8cac01e8-f35b-4518-97e6-358a42a2620d-kube-api-access-g6xg5\") pod \"collect-profiles-29328990-nf79t\" (UID: \"8cac01e8-f35b-4518-97e6-358a42a2620d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.477717 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" Oct 06 08:30:00 crc kubenswrapper[4755]: I1006 08:30:00.658874 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t"] Oct 06 08:30:01 crc kubenswrapper[4755]: I1006 08:30:01.661547 4755 generic.go:334] "Generic (PLEG): container finished" podID="8cac01e8-f35b-4518-97e6-358a42a2620d" containerID="a8078a8b193b4df14ea04badc5890279fac83331bdd5c79aedf7387d2e60bbb5" exitCode=0 Oct 06 08:30:01 crc kubenswrapper[4755]: I1006 08:30:01.661652 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" event={"ID":"8cac01e8-f35b-4518-97e6-358a42a2620d","Type":"ContainerDied","Data":"a8078a8b193b4df14ea04badc5890279fac83331bdd5c79aedf7387d2e60bbb5"} Oct 06 08:30:01 crc kubenswrapper[4755]: I1006 08:30:01.662096 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" 
event={"ID":"8cac01e8-f35b-4518-97e6-358a42a2620d","Type":"ContainerStarted","Data":"73cd297030cb754320933f2696ef3b32ca341e136bedeeee9c698ed0008f7a5f"} Oct 06 08:30:02 crc kubenswrapper[4755]: I1006 08:30:02.873804 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" Oct 06 08:30:02 crc kubenswrapper[4755]: I1006 08:30:02.983397 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cac01e8-f35b-4518-97e6-358a42a2620d-config-volume\") pod \"8cac01e8-f35b-4518-97e6-358a42a2620d\" (UID: \"8cac01e8-f35b-4518-97e6-358a42a2620d\") " Oct 06 08:30:02 crc kubenswrapper[4755]: I1006 08:30:02.983485 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cac01e8-f35b-4518-97e6-358a42a2620d-secret-volume\") pod \"8cac01e8-f35b-4518-97e6-358a42a2620d\" (UID: \"8cac01e8-f35b-4518-97e6-358a42a2620d\") " Oct 06 08:30:02 crc kubenswrapper[4755]: I1006 08:30:02.983582 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6xg5\" (UniqueName: \"kubernetes.io/projected/8cac01e8-f35b-4518-97e6-358a42a2620d-kube-api-access-g6xg5\") pod \"8cac01e8-f35b-4518-97e6-358a42a2620d\" (UID: \"8cac01e8-f35b-4518-97e6-358a42a2620d\") " Oct 06 08:30:02 crc kubenswrapper[4755]: I1006 08:30:02.984472 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cac01e8-f35b-4518-97e6-358a42a2620d-config-volume" (OuterVolumeSpecName: "config-volume") pod "8cac01e8-f35b-4518-97e6-358a42a2620d" (UID: "8cac01e8-f35b-4518-97e6-358a42a2620d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:30:02 crc kubenswrapper[4755]: I1006 08:30:02.988326 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cac01e8-f35b-4518-97e6-358a42a2620d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8cac01e8-f35b-4518-97e6-358a42a2620d" (UID: "8cac01e8-f35b-4518-97e6-358a42a2620d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:30:02 crc kubenswrapper[4755]: I1006 08:30:02.988425 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cac01e8-f35b-4518-97e6-358a42a2620d-kube-api-access-g6xg5" (OuterVolumeSpecName: "kube-api-access-g6xg5") pod "8cac01e8-f35b-4518-97e6-358a42a2620d" (UID: "8cac01e8-f35b-4518-97e6-358a42a2620d"). InnerVolumeSpecName "kube-api-access-g6xg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:30:03 crc kubenswrapper[4755]: I1006 08:30:03.085590 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cac01e8-f35b-4518-97e6-358a42a2620d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:30:03 crc kubenswrapper[4755]: I1006 08:30:03.085625 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cac01e8-f35b-4518-97e6-358a42a2620d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:30:03 crc kubenswrapper[4755]: I1006 08:30:03.085640 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6xg5\" (UniqueName: \"kubernetes.io/projected/8cac01e8-f35b-4518-97e6-358a42a2620d-kube-api-access-g6xg5\") on node \"crc\" DevicePath \"\"" Oct 06 08:30:03 crc kubenswrapper[4755]: I1006 08:30:03.678489 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" 
event={"ID":"8cac01e8-f35b-4518-97e6-358a42a2620d","Type":"ContainerDied","Data":"73cd297030cb754320933f2696ef3b32ca341e136bedeeee9c698ed0008f7a5f"} Oct 06 08:30:03 crc kubenswrapper[4755]: I1006 08:30:03.678914 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73cd297030cb754320933f2696ef3b32ca341e136bedeeee9c698ed0008f7a5f" Oct 06 08:30:03 crc kubenswrapper[4755]: I1006 08:30:03.679003 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t" Oct 06 08:31:48 crc kubenswrapper[4755]: I1006 08:31:48.913288 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:31:48 crc kubenswrapper[4755]: I1006 08:31:48.914181 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.620554 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6lntj"] Oct 06 08:32:00 crc kubenswrapper[4755]: E1006 08:32:00.621325 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cac01e8-f35b-4518-97e6-358a42a2620d" containerName="collect-profiles" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.621340 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cac01e8-f35b-4518-97e6-358a42a2620d" containerName="collect-profiles" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.621430 4755 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8cac01e8-f35b-4518-97e6-358a42a2620d" containerName="collect-profiles" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.621911 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-6lntj" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.624651 4755 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-jpbbj" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.624868 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.625272 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.631873 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6lntj"] Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.638096 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-px4lg"] Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.639036 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-px4lg" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.641168 4755 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-txbl5" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.646512 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-xdqhx"] Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.647355 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdqhx" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.649332 4755 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-cr6gz" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.654039 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-px4lg"] Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.662112 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-xdqhx"] Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.782935 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn225\" (UniqueName: \"kubernetes.io/projected/2c101652-98d5-42e2-be82-f8058baf0fa9-kube-api-access-mn225\") pod \"cert-manager-webhook-5655c58dd6-xdqhx\" (UID: \"2c101652-98d5-42e2-be82-f8058baf0fa9\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-xdqhx" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.782989 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b66fv\" (UniqueName: \"kubernetes.io/projected/543c8799-3a5d-49b4-b39e-8e2ca0a055df-kube-api-access-b66fv\") pod \"cert-manager-cainjector-7f985d654d-6lntj\" (UID: \"543c8799-3a5d-49b4-b39e-8e2ca0a055df\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6lntj" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.783156 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgggs\" (UniqueName: \"kubernetes.io/projected/8f030de0-4449-4711-aa8e-9429fc81e43b-kube-api-access-wgggs\") pod \"cert-manager-5b446d88c5-px4lg\" (UID: \"8f030de0-4449-4711-aa8e-9429fc81e43b\") " pod="cert-manager/cert-manager-5b446d88c5-px4lg" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.884201 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn225\" (UniqueName: \"kubernetes.io/projected/2c101652-98d5-42e2-be82-f8058baf0fa9-kube-api-access-mn225\") pod \"cert-manager-webhook-5655c58dd6-xdqhx\" (UID: \"2c101652-98d5-42e2-be82-f8058baf0fa9\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-xdqhx" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.884246 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b66fv\" (UniqueName: \"kubernetes.io/projected/543c8799-3a5d-49b4-b39e-8e2ca0a055df-kube-api-access-b66fv\") pod \"cert-manager-cainjector-7f985d654d-6lntj\" (UID: \"543c8799-3a5d-49b4-b39e-8e2ca0a055df\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6lntj" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.884308 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgggs\" (UniqueName: \"kubernetes.io/projected/8f030de0-4449-4711-aa8e-9429fc81e43b-kube-api-access-wgggs\") pod \"cert-manager-5b446d88c5-px4lg\" (UID: \"8f030de0-4449-4711-aa8e-9429fc81e43b\") " pod="cert-manager/cert-manager-5b446d88c5-px4lg" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.906188 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b66fv\" (UniqueName: \"kubernetes.io/projected/543c8799-3a5d-49b4-b39e-8e2ca0a055df-kube-api-access-b66fv\") pod \"cert-manager-cainjector-7f985d654d-6lntj\" (UID: \"543c8799-3a5d-49b4-b39e-8e2ca0a055df\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6lntj" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.907059 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgggs\" (UniqueName: \"kubernetes.io/projected/8f030de0-4449-4711-aa8e-9429fc81e43b-kube-api-access-wgggs\") pod \"cert-manager-5b446d88c5-px4lg\" (UID: \"8f030de0-4449-4711-aa8e-9429fc81e43b\") " 
pod="cert-manager/cert-manager-5b446d88c5-px4lg" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.910233 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn225\" (UniqueName: \"kubernetes.io/projected/2c101652-98d5-42e2-be82-f8058baf0fa9-kube-api-access-mn225\") pod \"cert-manager-webhook-5655c58dd6-xdqhx\" (UID: \"2c101652-98d5-42e2-be82-f8058baf0fa9\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-xdqhx" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.942646 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-6lntj" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.959524 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-px4lg" Oct 06 08:32:00 crc kubenswrapper[4755]: I1006 08:32:00.973644 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdqhx" Oct 06 08:32:01 crc kubenswrapper[4755]: I1006 08:32:01.363406 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-px4lg"] Oct 06 08:32:01 crc kubenswrapper[4755]: I1006 08:32:01.366183 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6lntj"] Oct 06 08:32:01 crc kubenswrapper[4755]: I1006 08:32:01.372326 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 08:32:01 crc kubenswrapper[4755]: I1006 08:32:01.417634 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-xdqhx"] Oct 06 08:32:01 crc kubenswrapper[4755]: I1006 08:32:01.418822 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-px4lg" 
event={"ID":"8f030de0-4449-4711-aa8e-9429fc81e43b","Type":"ContainerStarted","Data":"e6fc9163ad5391cbf0381610ea18c9b64cdc9ad098302e26cd9d6eba5527d451"} Oct 06 08:32:01 crc kubenswrapper[4755]: I1006 08:32:01.419827 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-6lntj" event={"ID":"543c8799-3a5d-49b4-b39e-8e2ca0a055df","Type":"ContainerStarted","Data":"93fe2b43cbfd420368840581df405f553721917820f763032a716c438443bab8"} Oct 06 08:32:02 crc kubenswrapper[4755]: I1006 08:32:02.427535 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdqhx" event={"ID":"2c101652-98d5-42e2-be82-f8058baf0fa9","Type":"ContainerStarted","Data":"c8f6bb72e758bd91799c93872152445e812e6973478ef1b8cc8a4ac68f2f3c2d"} Oct 06 08:32:05 crc kubenswrapper[4755]: I1006 08:32:05.464727 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-px4lg" event={"ID":"8f030de0-4449-4711-aa8e-9429fc81e43b","Type":"ContainerStarted","Data":"31e260af2ae6f603ffc915e3e1508c8ca16f29b03fb1e593a8d9a608801ffc28"} Oct 06 08:32:05 crc kubenswrapper[4755]: I1006 08:32:05.468397 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-6lntj" event={"ID":"543c8799-3a5d-49b4-b39e-8e2ca0a055df","Type":"ContainerStarted","Data":"3f74eee9417f2a2fea0545a79f850aa0716c854fdce8285a2a95d2dddcebeb96"} Oct 06 08:32:05 crc kubenswrapper[4755]: I1006 08:32:05.471169 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdqhx" event={"ID":"2c101652-98d5-42e2-be82-f8058baf0fa9","Type":"ContainerStarted","Data":"3f7870620ffb02d18e26c360b66cb95ca6d604876c871b4c15e7fa78af260d20"} Oct 06 08:32:05 crc kubenswrapper[4755]: I1006 08:32:05.472252 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdqhx" Oct 06 08:32:05 crc 
kubenswrapper[4755]: I1006 08:32:05.501693 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-px4lg" podStartSLOduration=2.412044206 podStartE2EDuration="5.501640373s" podCreationTimestamp="2025-10-06 08:32:00 +0000 UTC" firstStartedPulling="2025-10-06 08:32:01.372027799 +0000 UTC m=+578.201343013" lastFinishedPulling="2025-10-06 08:32:04.461623966 +0000 UTC m=+581.290939180" observedRunningTime="2025-10-06 08:32:05.492293611 +0000 UTC m=+582.321608855" watchObservedRunningTime="2025-10-06 08:32:05.501640373 +0000 UTC m=+582.330955617" Oct 06 08:32:05 crc kubenswrapper[4755]: I1006 08:32:05.513794 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-6lntj" podStartSLOduration=2.358499725 podStartE2EDuration="5.513761291s" podCreationTimestamp="2025-10-06 08:32:00 +0000 UTC" firstStartedPulling="2025-10-06 08:32:01.372166603 +0000 UTC m=+578.201481817" lastFinishedPulling="2025-10-06 08:32:04.527428169 +0000 UTC m=+581.356743383" observedRunningTime="2025-10-06 08:32:05.511275202 +0000 UTC m=+582.340590466" watchObservedRunningTime="2025-10-06 08:32:05.513761291 +0000 UTC m=+582.343076565" Oct 06 08:32:05 crc kubenswrapper[4755]: I1006 08:32:05.537079 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdqhx" podStartSLOduration=2.9014976040000002 podStartE2EDuration="5.537044884s" podCreationTimestamp="2025-10-06 08:32:00 +0000 UTC" firstStartedPulling="2025-10-06 08:32:01.42509144 +0000 UTC m=+578.254406654" lastFinishedPulling="2025-10-06 08:32:04.06063872 +0000 UTC m=+580.889953934" observedRunningTime="2025-10-06 08:32:05.530487038 +0000 UTC m=+582.359802292" watchObservedRunningTime="2025-10-06 08:32:05.537044884 +0000 UTC m=+582.366360138" Oct 06 08:32:10 crc kubenswrapper[4755]: I1006 08:32:10.976813 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdqhx" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.235227 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r8qq9"] Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.236368 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="nbdb" containerID="cri-o://a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.236834 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.236770 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="sbdb" containerID="cri-o://9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.237005 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="kube-rbac-proxy-node" containerID="cri-o://63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.237058 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="northd" 
containerID="cri-o://e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.237086 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovn-acl-logging" containerID="cri-o://53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.238305 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovn-controller" containerID="cri-o://5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.271132 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" containerID="cri-o://cb1b1c2195b9c9b6379198f3a3261db7589467cdce5907a8d6e27d4c77ba7723" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.506201 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovnkube-controller/3.log" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.510728 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovn-acl-logging/0.log" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.511478 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovn-controller/0.log" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.511987 4755 generic.go:334] "Generic 
(PLEG): container finished" podID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerID="cb1b1c2195b9c9b6379198f3a3261db7589467cdce5907a8d6e27d4c77ba7723" exitCode=0 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512083 4755 generic.go:334] "Generic (PLEG): container finished" podID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerID="9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97" exitCode=0 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512152 4755 generic.go:334] "Generic (PLEG): container finished" podID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerID="a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c" exitCode=0 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512223 4755 generic.go:334] "Generic (PLEG): container finished" podID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerID="e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101" exitCode=0 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512354 4755 generic.go:334] "Generic (PLEG): container finished" podID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerID="8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068" exitCode=0 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512457 4755 generic.go:334] "Generic (PLEG): container finished" podID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerID="63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7" exitCode=0 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512521 4755 generic.go:334] "Generic (PLEG): container finished" podID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerID="53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f" exitCode=143 Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512601 4755 generic.go:334] "Generic (PLEG): container finished" podID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerID="5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f" exitCode=143 Oct 06 08:32:11 crc kubenswrapper[4755]: 
I1006 08:32:11.512092 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"cb1b1c2195b9c9b6379198f3a3261db7589467cdce5907a8d6e27d4c77ba7723"} Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512732 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97"} Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512766 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c"} Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512784 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101"} Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512799 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068"} Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512808 4755 scope.go:117] "RemoveContainer" containerID="5d1ff3e76cc43cb87a843ffe66a87b27e413c41b79703d90381aa597fcca10cd" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512813 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" 
event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7"} Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512969 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f"} Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.512998 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f"} Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.513009 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" event={"ID":"b0b431db-f56c-43e6-9f53-fbc28b857422","Type":"ContainerDied","Data":"402b7fc3b000089f7775a166774f0b7b9c7478c425671def97e65a93a6d825c5"} Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.513023 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="402b7fc3b000089f7775a166774f0b7b9c7478c425671def97e65a93a6d825c5" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.515014 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r96nx_891dff9a-4752-4022-83fc-51f626c76991/kube-multus/2.log" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.515921 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r96nx_891dff9a-4752-4022-83fc-51f626c76991/kube-multus/1.log" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.515959 4755 generic.go:334] "Generic (PLEG): container finished" podID="891dff9a-4752-4022-83fc-51f626c76991" containerID="8f5c5a4fe5b9198f4a4c418537672dd9a1cf023530aef141cb92df515748ed51" exitCode=2 
Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.515983 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r96nx" event={"ID":"891dff9a-4752-4022-83fc-51f626c76991","Type":"ContainerDied","Data":"8f5c5a4fe5b9198f4a4c418537672dd9a1cf023530aef141cb92df515748ed51"} Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.516724 4755 scope.go:117] "RemoveContainer" containerID="8f5c5a4fe5b9198f4a4c418537672dd9a1cf023530aef141cb92df515748ed51" Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.516937 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-r96nx_openshift-multus(891dff9a-4752-4022-83fc-51f626c76991)\"" pod="openshift-multus/multus-r96nx" podUID="891dff9a-4752-4022-83fc-51f626c76991" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.600705 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovn-acl-logging/0.log" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.601248 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovn-controller/0.log" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.601667 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.604742 4755 scope.go:117] "RemoveContainer" containerID="252293c04559937fb3bdeb7f0f06764cb74f7d658b1b16705d8dcc071ba9542c" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.654679 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9f95v"] Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.655016 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655042 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.655054 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="northd" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655064 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="northd" Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.655075 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655082 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.655095 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="kubecfg-setup" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655104 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="kubecfg-setup" 
Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.655115 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovn-acl-logging" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655124 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovn-acl-logging" Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.655141 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655149 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.655168 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655176 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.655188 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="kube-rbac-proxy-node" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655195 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="kube-rbac-proxy-node" Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.655210 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="nbdb" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655218 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" 
containerName="nbdb" Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.655230 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovn-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655239 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovn-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.655252 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="sbdb" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655260 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="sbdb" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655375 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="kube-rbac-proxy-node" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655390 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655404 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655415 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovn-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655424 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655437 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" 
containerName="nbdb" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655447 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="sbdb" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655457 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="northd" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655469 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovn-acl-logging" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655478 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655487 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.655646 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655660 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.655833 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: E1006 08:32:11.655989 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.656015 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.658540 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.744877 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-openvswitch\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.744997 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.745233 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-cni-netd\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.745276 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.745513 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-node-log\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.745710 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-ovn\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.745778 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-node-log" (OuterVolumeSpecName: "node-log") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.745864 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-var-lib-openvswitch\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.745971 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-env-overrides\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746045 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-run-ovn-kubernetes\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746127 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-run-netns\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746211 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-ovnkube-script-lib\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746283 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-slash\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746355 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-cni-bin\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746433 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w22sj\" (UniqueName: \"kubernetes.io/projected/b0b431db-f56c-43e6-9f53-fbc28b857422-kube-api-access-w22sj\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746506 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0b431db-f56c-43e6-9f53-fbc28b857422-ovn-node-metrics-cert\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746598 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-etc-openvswitch\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746680 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746764 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-systemd\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746831 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-log-socket\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746902 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-ovnkube-config\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746976 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-kubelet\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.747047 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-systemd-units\") pod \"b0b431db-f56c-43e6-9f53-fbc28b857422\" (UID: \"b0b431db-f56c-43e6-9f53-fbc28b857422\") " Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.745841 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.745891 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746753 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746796 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-slash" (OuterVolumeSpecName: "host-slash") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746830 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). 
InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.747231 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746860 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.747265 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-log-socket" (OuterVolumeSpecName: "log-socket") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.746893 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.747135 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.747288 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.747559 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a2632e4-8653-43f7-9519-417d20bac39b-ovn-node-metrics-cert\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.747673 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-cni-bin\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.747708 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.747753 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a2632e4-8653-43f7-9519-417d20bac39b-env-overrides\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.747870 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-run-netns\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.747914 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-systemd-units\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.747922 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.748002 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-run-ovn\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.748073 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.748082 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-cni-netd\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.748222 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-run-ovn-kubernetes\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.748287 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9a2632e4-8653-43f7-9519-417d20bac39b-ovnkube-config\") pod 
\"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.748363 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-etc-openvswitch\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.748525 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-var-lib-openvswitch\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.748697 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.748771 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-run-openvswitch\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.748818 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-kubelet\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.748866 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9a2632e4-8653-43f7-9519-417d20bac39b-ovnkube-script-lib\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.748959 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d95d7\" (UniqueName: \"kubernetes.io/projected/9a2632e4-8653-43f7-9519-417d20bac39b-kube-api-access-d95d7\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749101 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-log-socket\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749158 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-slash\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749182 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-run-systemd\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749207 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-node-log\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749390 4755 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749423 4755 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749436 4755 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749449 4755 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-node-log\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749491 4755 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749506 
4755 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749520 4755 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749533 4755 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749550 4755 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749584 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749596 4755 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-slash\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749608 4755 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749621 4755 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749633 4755 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749647 4755 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-log-socket\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749659 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0b431db-f56c-43e6-9f53-fbc28b857422-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.749672 4755 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.752584 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b431db-f56c-43e6-9f53-fbc28b857422-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.753354 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b431db-f56c-43e6-9f53-fbc28b857422-kube-api-access-w22sj" (OuterVolumeSpecName: "kube-api-access-w22sj") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "kube-api-access-w22sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.762493 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b0b431db-f56c-43e6-9f53-fbc28b857422" (UID: "b0b431db-f56c-43e6-9f53-fbc28b857422"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.850766 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-var-lib-openvswitch\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.850847 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.850880 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-var-lib-openvswitch\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.850913 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-run-openvswitch\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.850883 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-run-openvswitch\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.850944 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.850953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-kubelet\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.850979 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/9a2632e4-8653-43f7-9519-417d20bac39b-ovnkube-script-lib\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851011 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d95d7\" (UniqueName: \"kubernetes.io/projected/9a2632e4-8653-43f7-9519-417d20bac39b-kube-api-access-d95d7\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851033 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-log-socket\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851048 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-kubelet\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851060 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-node-log\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851081 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-slash\") pod 
\"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851100 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-run-systemd\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851136 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a2632e4-8653-43f7-9519-417d20bac39b-ovn-node-metrics-cert\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851161 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-cni-bin\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851183 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a2632e4-8653-43f7-9519-417d20bac39b-env-overrides\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851210 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-run-netns\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851233 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-systemd-units\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851267 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-run-ovn\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851293 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-cni-netd\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851315 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9a2632e4-8653-43f7-9519-417d20bac39b-ovnkube-config\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851331 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-run-ovn-kubernetes\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc 
kubenswrapper[4755]: I1006 08:32:11.851346 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-etc-openvswitch\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851384 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w22sj\" (UniqueName: \"kubernetes.io/projected/b0b431db-f56c-43e6-9f53-fbc28b857422-kube-api-access-w22sj\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851396 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0b431db-f56c-43e6-9f53-fbc28b857422-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851407 4755 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0b431db-f56c-43e6-9f53-fbc28b857422-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851433 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-etc-openvswitch\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851458 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-run-netns\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 
08:32:11.851478 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-systemd-units\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851498 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-run-ovn\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851517 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-cni-netd\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851537 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-run-ovn-kubernetes\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851630 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-slash\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851674 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-host-cni-bin\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851720 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-node-log\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851700 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-log-socket\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.851750 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a2632e4-8653-43f7-9519-417d20bac39b-run-systemd\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.852072 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9a2632e4-8653-43f7-9519-417d20bac39b-ovnkube-script-lib\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.852102 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a2632e4-8653-43f7-9519-417d20bac39b-env-overrides\") pod \"ovnkube-node-9f95v\" (UID: 
\"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.852440 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9a2632e4-8653-43f7-9519-417d20bac39b-ovnkube-config\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.854331 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a2632e4-8653-43f7-9519-417d20bac39b-ovn-node-metrics-cert\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.873492 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d95d7\" (UniqueName: \"kubernetes.io/projected/9a2632e4-8653-43f7-9519-417d20bac39b-kube-api-access-d95d7\") pod \"ovnkube-node-9f95v\" (UID: \"9a2632e4-8653-43f7-9519-417d20bac39b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:11 crc kubenswrapper[4755]: I1006 08:32:11.978513 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:12 crc kubenswrapper[4755]: I1006 08:32:12.526556 4755 generic.go:334] "Generic (PLEG): container finished" podID="9a2632e4-8653-43f7-9519-417d20bac39b" containerID="1aefb2a01dfe0179983fed5db0bfec49a5e0a7b094642eca69786c1109ac7b8f" exitCode=0 Oct 06 08:32:12 crc kubenswrapper[4755]: I1006 08:32:12.526685 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" event={"ID":"9a2632e4-8653-43f7-9519-417d20bac39b","Type":"ContainerDied","Data":"1aefb2a01dfe0179983fed5db0bfec49a5e0a7b094642eca69786c1109ac7b8f"} Oct 06 08:32:12 crc kubenswrapper[4755]: I1006 08:32:12.526860 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" event={"ID":"9a2632e4-8653-43f7-9519-417d20bac39b","Type":"ContainerStarted","Data":"eb33d854d3f012884936935d91e2774dd85830ad297cfdbdee164ced84616176"} Oct 06 08:32:12 crc kubenswrapper[4755]: I1006 08:32:12.531137 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovn-acl-logging/0.log" Oct 06 08:32:12 crc kubenswrapper[4755]: I1006 08:32:12.531968 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8qq9_b0b431db-f56c-43e6-9f53-fbc28b857422/ovn-controller/0.log" Oct 06 08:32:12 crc kubenswrapper[4755]: I1006 08:32:12.533522 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r8qq9" Oct 06 08:32:12 crc kubenswrapper[4755]: I1006 08:32:12.536217 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r96nx_891dff9a-4752-4022-83fc-51f626c76991/kube-multus/2.log" Oct 06 08:32:12 crc kubenswrapper[4755]: I1006 08:32:12.648812 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r8qq9"] Oct 06 08:32:12 crc kubenswrapper[4755]: I1006 08:32:12.654899 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r8qq9"] Oct 06 08:32:13 crc kubenswrapper[4755]: I1006 08:32:13.545363 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" event={"ID":"9a2632e4-8653-43f7-9519-417d20bac39b","Type":"ContainerStarted","Data":"8df506d1924f1922c418a805cbb074d1b25f4842ab245827766c149dec79f06a"} Oct 06 08:32:13 crc kubenswrapper[4755]: I1006 08:32:13.545693 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" event={"ID":"9a2632e4-8653-43f7-9519-417d20bac39b","Type":"ContainerStarted","Data":"fb16e7cdf1abe991f32f4e45b7dac9e265b64bdcb8bbd56aaaa894d51710365f"} Oct 06 08:32:13 crc kubenswrapper[4755]: I1006 08:32:13.545705 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" event={"ID":"9a2632e4-8653-43f7-9519-417d20bac39b","Type":"ContainerStarted","Data":"5724be95c3b2d5069fdd744d87baa91231a4f3189eb1f23f38213e932a5f2287"} Oct 06 08:32:13 crc kubenswrapper[4755]: I1006 08:32:13.545714 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" event={"ID":"9a2632e4-8653-43f7-9519-417d20bac39b","Type":"ContainerStarted","Data":"acb9fbd17d21f4d2b6b5f66ecad89d934ef0ea24264cc3d4c52001664b0e884a"} Oct 06 08:32:13 crc kubenswrapper[4755]: I1006 08:32:13.545722 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" event={"ID":"9a2632e4-8653-43f7-9519-417d20bac39b","Type":"ContainerStarted","Data":"d7f5b5dbd3dfd1e0ac8171b1d9f9ecf5cfbd6fe991b13480fc0321e1efb4bb1a"} Oct 06 08:32:13 crc kubenswrapper[4755]: I1006 08:32:13.545746 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" event={"ID":"9a2632e4-8653-43f7-9519-417d20bac39b","Type":"ContainerStarted","Data":"949da7ea55373707fe0cf9d3a73f024c1ff845fa692c67cbb99170e1be75de90"} Oct 06 08:32:13 crc kubenswrapper[4755]: I1006 08:32:13.884450 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0b431db-f56c-43e6-9f53-fbc28b857422" path="/var/lib/kubelet/pods/b0b431db-f56c-43e6-9f53-fbc28b857422/volumes" Oct 06 08:32:15 crc kubenswrapper[4755]: I1006 08:32:15.568655 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" event={"ID":"9a2632e4-8653-43f7-9519-417d20bac39b","Type":"ContainerStarted","Data":"5cb6ea33cb3e19b8bcd10c171644b2f876e8930eac163ef3d05cf5bc25edfa05"} Oct 06 08:32:18 crc kubenswrapper[4755]: I1006 08:32:18.591150 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" event={"ID":"9a2632e4-8653-43f7-9519-417d20bac39b","Type":"ContainerStarted","Data":"5d7ad515c80d56c2cac84f7e3221f5580652d5449fdbea0905b80af19b7772d3"} Oct 06 08:32:18 crc kubenswrapper[4755]: I1006 08:32:18.591542 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:18 crc kubenswrapper[4755]: I1006 08:32:18.591586 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:18 crc kubenswrapper[4755]: I1006 08:32:18.591601 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 
08:32:18 crc kubenswrapper[4755]: I1006 08:32:18.623290 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" podStartSLOduration=7.623266592 podStartE2EDuration="7.623266592s" podCreationTimestamp="2025-10-06 08:32:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:32:18.621532133 +0000 UTC m=+595.450847357" watchObservedRunningTime="2025-10-06 08:32:18.623266592 +0000 UTC m=+595.452581806" Oct 06 08:32:18 crc kubenswrapper[4755]: I1006 08:32:18.624511 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:18 crc kubenswrapper[4755]: I1006 08:32:18.626288 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:18 crc kubenswrapper[4755]: I1006 08:32:18.912996 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:32:18 crc kubenswrapper[4755]: I1006 08:32:18.913087 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:32:24 crc kubenswrapper[4755]: I1006 08:32:24.089463 4755 scope.go:117] "RemoveContainer" containerID="a8e733504616c927e8301c4dd26be87cbc319637803b822d84eb280dba5cb70c" Oct 06 08:32:24 crc kubenswrapper[4755]: I1006 08:32:24.105946 4755 scope.go:117] "RemoveContainer" 
containerID="8720fed855b7574fe791872410325c49658f9b739fe3efbb9decf307d9e54068" Oct 06 08:32:24 crc kubenswrapper[4755]: I1006 08:32:24.120666 4755 scope.go:117] "RemoveContainer" containerID="9e5a743d79e7de9ac299034f458fab6ae88f0efcb4d334bdacb850e23e551d97" Oct 06 08:32:24 crc kubenswrapper[4755]: I1006 08:32:24.135162 4755 scope.go:117] "RemoveContainer" containerID="e4c26b556a07fcfb183553a5f3733b0fab32418098088dc1a6529c59f4388101" Oct 06 08:32:24 crc kubenswrapper[4755]: I1006 08:32:24.150231 4755 scope.go:117] "RemoveContainer" containerID="c7a90b4a3934614f953c8954401fb1ddc6b0f8e1c37961c27e082085130c98b7" Oct 06 08:32:24 crc kubenswrapper[4755]: I1006 08:32:24.168162 4755 scope.go:117] "RemoveContainer" containerID="63f8063b926eb6c72ef41041a58ce92b660ca32a49f179079f34564231bb60b7" Oct 06 08:32:24 crc kubenswrapper[4755]: I1006 08:32:24.184345 4755 scope.go:117] "RemoveContainer" containerID="5d43f8585b226ba8dc368ac39ea4b0a74303c8acacc8850ec3fea76cbc4c738f" Oct 06 08:32:24 crc kubenswrapper[4755]: I1006 08:32:24.203193 4755 scope.go:117] "RemoveContainer" containerID="cb1b1c2195b9c9b6379198f3a3261db7589467cdce5907a8d6e27d4c77ba7723" Oct 06 08:32:24 crc kubenswrapper[4755]: I1006 08:32:24.220812 4755 scope.go:117] "RemoveContainer" containerID="53e61048b94cfe1b032c03ac0efaed54596fa5d077520c8060068167966aba6f" Oct 06 08:32:26 crc kubenswrapper[4755]: I1006 08:32:26.879310 4755 scope.go:117] "RemoveContainer" containerID="8f5c5a4fe5b9198f4a4c418537672dd9a1cf023530aef141cb92df515748ed51" Oct 06 08:32:26 crc kubenswrapper[4755]: E1006 08:32:26.879628 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-r96nx_openshift-multus(891dff9a-4752-4022-83fc-51f626c76991)\"" pod="openshift-multus/multus-r96nx" podUID="891dff9a-4752-4022-83fc-51f626c76991" Oct 06 08:32:41 crc kubenswrapper[4755]: I1006 08:32:41.880541 4755 
scope.go:117] "RemoveContainer" containerID="8f5c5a4fe5b9198f4a4c418537672dd9a1cf023530aef141cb92df515748ed51" Oct 06 08:32:42 crc kubenswrapper[4755]: I1006 08:32:42.002871 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9f95v" Oct 06 08:32:42 crc kubenswrapper[4755]: I1006 08:32:42.768695 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r96nx_891dff9a-4752-4022-83fc-51f626c76991/kube-multus/2.log" Oct 06 08:32:42 crc kubenswrapper[4755]: I1006 08:32:42.769205 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r96nx" event={"ID":"891dff9a-4752-4022-83fc-51f626c76991","Type":"ContainerStarted","Data":"c3d2b9431ab2e72c1ed251c8530dc7ae1dd55afc9dfbd6d4d427879ae728b09d"} Oct 06 08:32:48 crc kubenswrapper[4755]: I1006 08:32:48.913014 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:32:48 crc kubenswrapper[4755]: I1006 08:32:48.913667 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:32:48 crc kubenswrapper[4755]: I1006 08:32:48.913737 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:32:48 crc kubenswrapper[4755]: I1006 08:32:48.914491 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d91d9012e478d7f838adb567aaf83be7e24217db74ea1547bb0d299bd1231bbd"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:32:48 crc kubenswrapper[4755]: I1006 08:32:48.914623 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://d91d9012e478d7f838adb567aaf83be7e24217db74ea1547bb0d299bd1231bbd" gracePeriod=600 Oct 06 08:32:49 crc kubenswrapper[4755]: I1006 08:32:49.817792 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="d91d9012e478d7f838adb567aaf83be7e24217db74ea1547bb0d299bd1231bbd" exitCode=0 Oct 06 08:32:49 crc kubenswrapper[4755]: I1006 08:32:49.817909 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"d91d9012e478d7f838adb567aaf83be7e24217db74ea1547bb0d299bd1231bbd"} Oct 06 08:32:49 crc kubenswrapper[4755]: I1006 08:32:49.818351 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"d429b678b36d347ceb6d82738a5216f8e1c07a0afd1e703d9e929f6a065850ec"} Oct 06 08:32:49 crc kubenswrapper[4755]: I1006 08:32:49.818394 4755 scope.go:117] "RemoveContainer" containerID="fec0c4eb81f7712bab171b121e51397b6025d5a32e7a8d750be5c472df105d18" Oct 06 08:32:53 crc kubenswrapper[4755]: I1006 08:32:53.267355 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t"] Oct 06 08:32:53 crc 
kubenswrapper[4755]: I1006 08:32:53.270048 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" Oct 06 08:32:53 crc kubenswrapper[4755]: I1006 08:32:53.275626 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 08:32:53 crc kubenswrapper[4755]: I1006 08:32:53.276710 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t"] Oct 06 08:32:53 crc kubenswrapper[4755]: I1006 08:32:53.471090 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8dffa79-06e6-40e3-9769-541d9af8f0f8-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t\" (UID: \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" Oct 06 08:32:53 crc kubenswrapper[4755]: I1006 08:32:53.471155 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s655j\" (UniqueName: \"kubernetes.io/projected/d8dffa79-06e6-40e3-9769-541d9af8f0f8-kube-api-access-s655j\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t\" (UID: \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" Oct 06 08:32:53 crc kubenswrapper[4755]: I1006 08:32:53.471180 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8dffa79-06e6-40e3-9769-541d9af8f0f8-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t\" (UID: \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\") " 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" Oct 06 08:32:53 crc kubenswrapper[4755]: I1006 08:32:53.573391 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s655j\" (UniqueName: \"kubernetes.io/projected/d8dffa79-06e6-40e3-9769-541d9af8f0f8-kube-api-access-s655j\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t\" (UID: \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" Oct 06 08:32:53 crc kubenswrapper[4755]: I1006 08:32:53.573497 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8dffa79-06e6-40e3-9769-541d9af8f0f8-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t\" (UID: \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" Oct 06 08:32:53 crc kubenswrapper[4755]: I1006 08:32:53.573800 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8dffa79-06e6-40e3-9769-541d9af8f0f8-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t\" (UID: \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" Oct 06 08:32:53 crc kubenswrapper[4755]: I1006 08:32:53.574494 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8dffa79-06e6-40e3-9769-541d9af8f0f8-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t\" (UID: \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" Oct 06 08:32:53 crc kubenswrapper[4755]: I1006 08:32:53.574486 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8dffa79-06e6-40e3-9769-541d9af8f0f8-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t\" (UID: \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" Oct 06 08:32:53 crc kubenswrapper[4755]: I1006 08:32:53.600276 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s655j\" (UniqueName: \"kubernetes.io/projected/d8dffa79-06e6-40e3-9769-541d9af8f0f8-kube-api-access-s655j\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t\" (UID: \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" Oct 06 08:32:53 crc kubenswrapper[4755]: I1006 08:32:53.891257 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" Oct 06 08:32:54 crc kubenswrapper[4755]: I1006 08:32:54.137520 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t"] Oct 06 08:32:54 crc kubenswrapper[4755]: W1006 08:32:54.142158 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8dffa79_06e6_40e3_9769_541d9af8f0f8.slice/crio-d416cf837369c3053511cb029a1edeb6ccb1159fa5f8e5df4b24f72610c943e2 WatchSource:0}: Error finding container d416cf837369c3053511cb029a1edeb6ccb1159fa5f8e5df4b24f72610c943e2: Status 404 returned error can't find the container with id d416cf837369c3053511cb029a1edeb6ccb1159fa5f8e5df4b24f72610c943e2 Oct 06 08:32:54 crc kubenswrapper[4755]: I1006 08:32:54.856870 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="d8dffa79-06e6-40e3-9769-541d9af8f0f8" containerID="33ff18fb2140fc0032f39b235a1f4566c55ff1b6d056131aed9b6f18217549bd" exitCode=0 Oct 06 08:32:54 crc kubenswrapper[4755]: I1006 08:32:54.856957 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" event={"ID":"d8dffa79-06e6-40e3-9769-541d9af8f0f8","Type":"ContainerDied","Data":"33ff18fb2140fc0032f39b235a1f4566c55ff1b6d056131aed9b6f18217549bd"} Oct 06 08:32:54 crc kubenswrapper[4755]: I1006 08:32:54.857037 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" event={"ID":"d8dffa79-06e6-40e3-9769-541d9af8f0f8","Type":"ContainerStarted","Data":"d416cf837369c3053511cb029a1edeb6ccb1159fa5f8e5df4b24f72610c943e2"} Oct 06 08:32:56 crc kubenswrapper[4755]: I1006 08:32:56.873813 4755 generic.go:334] "Generic (PLEG): container finished" podID="d8dffa79-06e6-40e3-9769-541d9af8f0f8" containerID="85de3385fa2ff11cf7fb77a3f8a559911de3e3f136b8d1afe155d6c2eb571a3e" exitCode=0 Oct 06 08:32:56 crc kubenswrapper[4755]: I1006 08:32:56.873887 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" event={"ID":"d8dffa79-06e6-40e3-9769-541d9af8f0f8","Type":"ContainerDied","Data":"85de3385fa2ff11cf7fb77a3f8a559911de3e3f136b8d1afe155d6c2eb571a3e"} Oct 06 08:32:57 crc kubenswrapper[4755]: I1006 08:32:57.881868 4755 generic.go:334] "Generic (PLEG): container finished" podID="d8dffa79-06e6-40e3-9769-541d9af8f0f8" containerID="89f3e97c1acf02aba6799270c783ad28459561b096d9b5af0275927d04cf9512" exitCode=0 Oct 06 08:32:57 crc kubenswrapper[4755]: I1006 08:32:57.889673 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" 
event={"ID":"d8dffa79-06e6-40e3-9769-541d9af8f0f8","Type":"ContainerDied","Data":"89f3e97c1acf02aba6799270c783ad28459561b096d9b5af0275927d04cf9512"} Oct 06 08:32:59 crc kubenswrapper[4755]: I1006 08:32:59.113073 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" Oct 06 08:32:59 crc kubenswrapper[4755]: I1006 08:32:59.248351 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s655j\" (UniqueName: \"kubernetes.io/projected/d8dffa79-06e6-40e3-9769-541d9af8f0f8-kube-api-access-s655j\") pod \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\" (UID: \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\") " Oct 06 08:32:59 crc kubenswrapper[4755]: I1006 08:32:59.248429 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8dffa79-06e6-40e3-9769-541d9af8f0f8-util\") pod \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\" (UID: \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\") " Oct 06 08:32:59 crc kubenswrapper[4755]: I1006 08:32:59.248477 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8dffa79-06e6-40e3-9769-541d9af8f0f8-bundle\") pod \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\" (UID: \"d8dffa79-06e6-40e3-9769-541d9af8f0f8\") " Oct 06 08:32:59 crc kubenswrapper[4755]: I1006 08:32:59.249063 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8dffa79-06e6-40e3-9769-541d9af8f0f8-bundle" (OuterVolumeSpecName: "bundle") pod "d8dffa79-06e6-40e3-9769-541d9af8f0f8" (UID: "d8dffa79-06e6-40e3-9769-541d9af8f0f8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:32:59 crc kubenswrapper[4755]: I1006 08:32:59.253950 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8dffa79-06e6-40e3-9769-541d9af8f0f8-kube-api-access-s655j" (OuterVolumeSpecName: "kube-api-access-s655j") pod "d8dffa79-06e6-40e3-9769-541d9af8f0f8" (UID: "d8dffa79-06e6-40e3-9769-541d9af8f0f8"). InnerVolumeSpecName "kube-api-access-s655j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:32:59 crc kubenswrapper[4755]: I1006 08:32:59.271872 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8dffa79-06e6-40e3-9769-541d9af8f0f8-util" (OuterVolumeSpecName: "util") pod "d8dffa79-06e6-40e3-9769-541d9af8f0f8" (UID: "d8dffa79-06e6-40e3-9769-541d9af8f0f8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:32:59 crc kubenswrapper[4755]: I1006 08:32:59.349544 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s655j\" (UniqueName: \"kubernetes.io/projected/d8dffa79-06e6-40e3-9769-541d9af8f0f8-kube-api-access-s655j\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:59 crc kubenswrapper[4755]: I1006 08:32:59.349597 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8dffa79-06e6-40e3-9769-541d9af8f0f8-util\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:59 crc kubenswrapper[4755]: I1006 08:32:59.349611 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8dffa79-06e6-40e3-9769-541d9af8f0f8-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:59 crc kubenswrapper[4755]: I1006 08:32:59.903806 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" 
event={"ID":"d8dffa79-06e6-40e3-9769-541d9af8f0f8","Type":"ContainerDied","Data":"d416cf837369c3053511cb029a1edeb6ccb1159fa5f8e5df4b24f72610c943e2"} Oct 06 08:32:59 crc kubenswrapper[4755]: I1006 08:32:59.903860 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d416cf837369c3053511cb029a1edeb6ccb1159fa5f8e5df4b24f72610c943e2" Oct 06 08:32:59 crc kubenswrapper[4755]: I1006 08:32:59.903904 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.177519 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-kklmt"] Oct 06 08:33:01 crc kubenswrapper[4755]: E1006 08:33:01.178610 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8dffa79-06e6-40e3-9769-541d9af8f0f8" containerName="pull" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.178657 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8dffa79-06e6-40e3-9769-541d9af8f0f8" containerName="pull" Oct 06 08:33:01 crc kubenswrapper[4755]: E1006 08:33:01.179104 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8dffa79-06e6-40e3-9769-541d9af8f0f8" containerName="extract" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.179127 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8dffa79-06e6-40e3-9769-541d9af8f0f8" containerName="extract" Oct 06 08:33:01 crc kubenswrapper[4755]: E1006 08:33:01.179154 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8dffa79-06e6-40e3-9769-541d9af8f0f8" containerName="util" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.179163 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8dffa79-06e6-40e3-9769-541d9af8f0f8" containerName="util" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.179353 4755 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d8dffa79-06e6-40e3-9769-541d9af8f0f8" containerName="extract" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.180062 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-kklmt" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.185497 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.185801 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.185953 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-jkdmj" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.190232 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-kklmt"] Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.277736 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcbg2\" (UniqueName: \"kubernetes.io/projected/3d1e2fed-d6da-41b0-8fb3-216a7563269e-kube-api-access-xcbg2\") pod \"nmstate-operator-858ddd8f98-kklmt\" (UID: \"3d1e2fed-d6da-41b0-8fb3-216a7563269e\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-kklmt" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.379014 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcbg2\" (UniqueName: \"kubernetes.io/projected/3d1e2fed-d6da-41b0-8fb3-216a7563269e-kube-api-access-xcbg2\") pod \"nmstate-operator-858ddd8f98-kklmt\" (UID: \"3d1e2fed-d6da-41b0-8fb3-216a7563269e\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-kklmt" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.399540 4755 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xcbg2\" (UniqueName: \"kubernetes.io/projected/3d1e2fed-d6da-41b0-8fb3-216a7563269e-kube-api-access-xcbg2\") pod \"nmstate-operator-858ddd8f98-kklmt\" (UID: \"3d1e2fed-d6da-41b0-8fb3-216a7563269e\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-kklmt" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.543096 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-kklmt" Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.791823 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-kklmt"] Oct 06 08:33:01 crc kubenswrapper[4755]: I1006 08:33:01.917689 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-kklmt" event={"ID":"3d1e2fed-d6da-41b0-8fb3-216a7563269e","Type":"ContainerStarted","Data":"166d2c90c18aaebf2abb4a21db6e7b67d67656019e864b44c66479cb90364a6b"} Oct 06 08:33:03 crc kubenswrapper[4755]: I1006 08:33:03.937284 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-kklmt" event={"ID":"3d1e2fed-d6da-41b0-8fb3-216a7563269e","Type":"ContainerStarted","Data":"856bb06b7e1df21f1f6ca89a4500aced264c8999389410748614f793e458d946"} Oct 06 08:33:03 crc kubenswrapper[4755]: I1006 08:33:03.960353 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-kklmt" podStartSLOduration=1.002545253 podStartE2EDuration="2.960325321s" podCreationTimestamp="2025-10-06 08:33:01 +0000 UTC" firstStartedPulling="2025-10-06 08:33:01.813855932 +0000 UTC m=+638.643171146" lastFinishedPulling="2025-10-06 08:33:03.771636 +0000 UTC m=+640.600951214" observedRunningTime="2025-10-06 08:33:03.955195185 +0000 UTC m=+640.784510429" watchObservedRunningTime="2025-10-06 08:33:03.960325321 +0000 UTC m=+640.789640555" Oct 06 08:33:04 crc 
kubenswrapper[4755]: I1006 08:33:04.875206 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-2ltnx"] Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.883771 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2ltnx" Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.893116 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86"] Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.893132 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-q7rjs" Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.897665 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.909077 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-2ltnx"] Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.912708 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-q99m9"] Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.913579 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.916341 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86"] Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.920916 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.929875 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d3aab626-14e3-4151-b7ea-7af710945fee-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-dxh86\" (UID: \"d3aab626-14e3-4151-b7ea-7af710945fee\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.929912 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5209c2e5-4435-4b41-950e-b1909e4853dc-dbus-socket\") pod \"nmstate-handler-q99m9\" (UID: \"5209c2e5-4435-4b41-950e-b1909e4853dc\") " pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.929956 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5209c2e5-4435-4b41-950e-b1909e4853dc-nmstate-lock\") pod \"nmstate-handler-q99m9\" (UID: \"5209c2e5-4435-4b41-950e-b1909e4853dc\") " pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.930010 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjb7h\" (UniqueName: \"kubernetes.io/projected/10aaa9f0-000d-46a3-8108-9e1f04820012-kube-api-access-fjb7h\") pod \"nmstate-metrics-fdff9cb8d-2ltnx\" (UID: 
\"10aaa9f0-000d-46a3-8108-9e1f04820012\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2ltnx" Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.930051 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgjqf\" (UniqueName: \"kubernetes.io/projected/5209c2e5-4435-4b41-950e-b1909e4853dc-kube-api-access-bgjqf\") pod \"nmstate-handler-q99m9\" (UID: \"5209c2e5-4435-4b41-950e-b1909e4853dc\") " pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.930087 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgdck\" (UniqueName: \"kubernetes.io/projected/d3aab626-14e3-4151-b7ea-7af710945fee-kube-api-access-pgdck\") pod \"nmstate-webhook-6cdbc54649-dxh86\" (UID: \"d3aab626-14e3-4151-b7ea-7af710945fee\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" Oct 06 08:33:04 crc kubenswrapper[4755]: I1006 08:33:04.930137 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5209c2e5-4435-4b41-950e-b1909e4853dc-ovs-socket\") pod \"nmstate-handler-q99m9\" (UID: \"5209c2e5-4435-4b41-950e-b1909e4853dc\") " pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.031432 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgjqf\" (UniqueName: \"kubernetes.io/projected/5209c2e5-4435-4b41-950e-b1909e4853dc-kube-api-access-bgjqf\") pod \"nmstate-handler-q99m9\" (UID: \"5209c2e5-4435-4b41-950e-b1909e4853dc\") " pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.031508 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgdck\" (UniqueName: 
\"kubernetes.io/projected/d3aab626-14e3-4151-b7ea-7af710945fee-kube-api-access-pgdck\") pod \"nmstate-webhook-6cdbc54649-dxh86\" (UID: \"d3aab626-14e3-4151-b7ea-7af710945fee\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.031808 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5209c2e5-4435-4b41-950e-b1909e4853dc-ovs-socket\") pod \"nmstate-handler-q99m9\" (UID: \"5209c2e5-4435-4b41-950e-b1909e4853dc\") " pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.031871 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5209c2e5-4435-4b41-950e-b1909e4853dc-ovs-socket\") pod \"nmstate-handler-q99m9\" (UID: \"5209c2e5-4435-4b41-950e-b1909e4853dc\") " pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.032048 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d3aab626-14e3-4151-b7ea-7af710945fee-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-dxh86\" (UID: \"d3aab626-14e3-4151-b7ea-7af710945fee\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.032097 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5209c2e5-4435-4b41-950e-b1909e4853dc-dbus-socket\") pod \"nmstate-handler-q99m9\" (UID: \"5209c2e5-4435-4b41-950e-b1909e4853dc\") " pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.032196 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/5209c2e5-4435-4b41-950e-b1909e4853dc-nmstate-lock\") pod \"nmstate-handler-q99m9\" (UID: \"5209c2e5-4435-4b41-950e-b1909e4853dc\") " pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.032265 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjb7h\" (UniqueName: \"kubernetes.io/projected/10aaa9f0-000d-46a3-8108-9e1f04820012-kube-api-access-fjb7h\") pod \"nmstate-metrics-fdff9cb8d-2ltnx\" (UID: \"10aaa9f0-000d-46a3-8108-9e1f04820012\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2ltnx" Oct 06 08:33:05 crc kubenswrapper[4755]: E1006 08:33:05.032321 4755 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 06 08:33:05 crc kubenswrapper[4755]: E1006 08:33:05.032431 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3aab626-14e3-4151-b7ea-7af710945fee-tls-key-pair podName:d3aab626-14e3-4151-b7ea-7af710945fee nodeName:}" failed. No retries permitted until 2025-10-06 08:33:05.532395249 +0000 UTC m=+642.361710463 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/d3aab626-14e3-4151-b7ea-7af710945fee-tls-key-pair") pod "nmstate-webhook-6cdbc54649-dxh86" (UID: "d3aab626-14e3-4151-b7ea-7af710945fee") : secret "openshift-nmstate-webhook" not found Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.032467 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5209c2e5-4435-4b41-950e-b1909e4853dc-dbus-socket\") pod \"nmstate-handler-q99m9\" (UID: \"5209c2e5-4435-4b41-950e-b1909e4853dc\") " pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.032546 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5209c2e5-4435-4b41-950e-b1909e4853dc-nmstate-lock\") pod \"nmstate-handler-q99m9\" (UID: \"5209c2e5-4435-4b41-950e-b1909e4853dc\") " pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.053735 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7"] Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.054687 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.058163 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.058467 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.058749 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-p665r" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.059947 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7"] Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.066862 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgdck\" (UniqueName: \"kubernetes.io/projected/d3aab626-14e3-4151-b7ea-7af710945fee-kube-api-access-pgdck\") pod \"nmstate-webhook-6cdbc54649-dxh86\" (UID: \"d3aab626-14e3-4151-b7ea-7af710945fee\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.067046 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgjqf\" (UniqueName: \"kubernetes.io/projected/5209c2e5-4435-4b41-950e-b1909e4853dc-kube-api-access-bgjqf\") pod \"nmstate-handler-q99m9\" (UID: \"5209c2e5-4435-4b41-950e-b1909e4853dc\") " pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.067316 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjb7h\" (UniqueName: \"kubernetes.io/projected/10aaa9f0-000d-46a3-8108-9e1f04820012-kube-api-access-fjb7h\") pod \"nmstate-metrics-fdff9cb8d-2ltnx\" (UID: \"10aaa9f0-000d-46a3-8108-9e1f04820012\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2ltnx" Oct 
06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.207972 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2ltnx" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.240817 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mktkf\" (UniqueName: \"kubernetes.io/projected/260d013b-89f5-4f12-959f-fe21b8a52fc6-kube-api-access-mktkf\") pod \"nmstate-console-plugin-6b874cbd85-vf4h7\" (UID: \"260d013b-89f5-4f12-959f-fe21b8a52fc6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.240902 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/260d013b-89f5-4f12-959f-fe21b8a52fc6-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-vf4h7\" (UID: \"260d013b-89f5-4f12-959f-fe21b8a52fc6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.240942 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/260d013b-89f5-4f12-959f-fe21b8a52fc6-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-vf4h7\" (UID: \"260d013b-89f5-4f12-959f-fe21b8a52fc6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.249251 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-64d5fd7569-rc8rd"] Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.250242 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.266713 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d5fd7569-rc8rd"] Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.284039 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.341980 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c46e1f6-79bb-4357-9207-9cfd53f174ad-console-config\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.342034 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c46e1f6-79bb-4357-9207-9cfd53f174ad-console-oauth-config\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.342051 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c46e1f6-79bb-4357-9207-9cfd53f174ad-console-serving-cert\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.342079 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mktkf\" (UniqueName: \"kubernetes.io/projected/260d013b-89f5-4f12-959f-fe21b8a52fc6-kube-api-access-mktkf\") pod \"nmstate-console-plugin-6b874cbd85-vf4h7\" (UID: 
\"260d013b-89f5-4f12-959f-fe21b8a52fc6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.342105 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfrcm\" (UniqueName: \"kubernetes.io/projected/6c46e1f6-79bb-4357-9207-9cfd53f174ad-kube-api-access-dfrcm\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.342146 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/260d013b-89f5-4f12-959f-fe21b8a52fc6-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-vf4h7\" (UID: \"260d013b-89f5-4f12-959f-fe21b8a52fc6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.342168 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c46e1f6-79bb-4357-9207-9cfd53f174ad-trusted-ca-bundle\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.342196 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/260d013b-89f5-4f12-959f-fe21b8a52fc6-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-vf4h7\" (UID: \"260d013b-89f5-4f12-959f-fe21b8a52fc6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.342221 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/6c46e1f6-79bb-4357-9207-9cfd53f174ad-oauth-serving-cert\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.342239 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c46e1f6-79bb-4357-9207-9cfd53f174ad-service-ca\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.343373 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/260d013b-89f5-4f12-959f-fe21b8a52fc6-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-vf4h7\" (UID: \"260d013b-89f5-4f12-959f-fe21b8a52fc6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.348731 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/260d013b-89f5-4f12-959f-fe21b8a52fc6-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-vf4h7\" (UID: \"260d013b-89f5-4f12-959f-fe21b8a52fc6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.365860 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mktkf\" (UniqueName: \"kubernetes.io/projected/260d013b-89f5-4f12-959f-fe21b8a52fc6-kube-api-access-mktkf\") pod \"nmstate-console-plugin-6b874cbd85-vf4h7\" (UID: \"260d013b-89f5-4f12-959f-fe21b8a52fc6\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.400872 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.444291 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c46e1f6-79bb-4357-9207-9cfd53f174ad-console-config\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.444975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c46e1f6-79bb-4357-9207-9cfd53f174ad-console-oauth-config\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.445005 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c46e1f6-79bb-4357-9207-9cfd53f174ad-console-serving-cert\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.445050 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfrcm\" (UniqueName: \"kubernetes.io/projected/6c46e1f6-79bb-4357-9207-9cfd53f174ad-kube-api-access-dfrcm\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.445076 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c46e1f6-79bb-4357-9207-9cfd53f174ad-trusted-ca-bundle\") pod \"console-64d5fd7569-rc8rd\" (UID: 
\"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.445123 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c46e1f6-79bb-4357-9207-9cfd53f174ad-oauth-serving-cert\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.445141 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c46e1f6-79bb-4357-9207-9cfd53f174ad-service-ca\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.445802 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c46e1f6-79bb-4357-9207-9cfd53f174ad-console-config\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.445985 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c46e1f6-79bb-4357-9207-9cfd53f174ad-service-ca\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.446549 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c46e1f6-79bb-4357-9207-9cfd53f174ad-oauth-serving-cert\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " 
pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.447136 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c46e1f6-79bb-4357-9207-9cfd53f174ad-trusted-ca-bundle\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.450294 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c46e1f6-79bb-4357-9207-9cfd53f174ad-console-serving-cert\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.461197 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c46e1f6-79bb-4357-9207-9cfd53f174ad-console-oauth-config\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.462319 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-2ltnx"] Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.466717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfrcm\" (UniqueName: \"kubernetes.io/projected/6c46e1f6-79bb-4357-9207-9cfd53f174ad-kube-api-access-dfrcm\") pod \"console-64d5fd7569-rc8rd\" (UID: \"6c46e1f6-79bb-4357-9207-9cfd53f174ad\") " pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.546406 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/d3aab626-14e3-4151-b7ea-7af710945fee-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-dxh86\" (UID: \"d3aab626-14e3-4151-b7ea-7af710945fee\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.553272 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d3aab626-14e3-4151-b7ea-7af710945fee-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-dxh86\" (UID: \"d3aab626-14e3-4151-b7ea-7af710945fee\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.563014 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.605701 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.815219 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7"] Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.844430 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d5fd7569-rc8rd"] Oct 06 08:33:05 crc kubenswrapper[4755]: W1006 08:33:05.845923 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c46e1f6_79bb_4357_9207_9cfd53f174ad.slice/crio-7e7daa6f3f80f3e50ae69757bac875439ce491136f7a60c9d09864b72e731273 WatchSource:0}: Error finding container 7e7daa6f3f80f3e50ae69757bac875439ce491136f7a60c9d09864b72e731273: Status 404 returned error can't find the container with id 7e7daa6f3f80f3e50ae69757bac875439ce491136f7a60c9d09864b72e731273 Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.950076 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-console/console-64d5fd7569-rc8rd" event={"ID":"6c46e1f6-79bb-4357-9207-9cfd53f174ad","Type":"ContainerStarted","Data":"7e7daa6f3f80f3e50ae69757bac875439ce491136f7a60c9d09864b72e731273"} Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.952197 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" event={"ID":"260d013b-89f5-4f12-959f-fe21b8a52fc6","Type":"ContainerStarted","Data":"1bc5febc4883b0164f8152f97a686652f6f60126b5f8e92aac3efa13b02acc3e"} Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.954381 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-q99m9" event={"ID":"5209c2e5-4435-4b41-950e-b1909e4853dc","Type":"ContainerStarted","Data":"cae431a11cd4982f887f1908906bdd4417025a49a0cc71878bbb6263cbcd8263"} Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.955604 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2ltnx" event={"ID":"10aaa9f0-000d-46a3-8108-9e1f04820012","Type":"ContainerStarted","Data":"34eb3493baf9a7fcf3677a4dbb8cdc4752ddbedf9fa95a42d7129d2a62c5be71"} Oct 06 08:33:05 crc kubenswrapper[4755]: I1006 08:33:05.980599 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86"] Oct 06 08:33:05 crc kubenswrapper[4755]: W1006 08:33:05.989831 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3aab626_14e3_4151_b7ea_7af710945fee.slice/crio-98dee9c6219b2e9a31c1cff25434f375f7d293f3e41ab59bd15bd3dbb6b54249 WatchSource:0}: Error finding container 98dee9c6219b2e9a31c1cff25434f375f7d293f3e41ab59bd15bd3dbb6b54249: Status 404 returned error can't find the container with id 98dee9c6219b2e9a31c1cff25434f375f7d293f3e41ab59bd15bd3dbb6b54249 Oct 06 08:33:06 crc kubenswrapper[4755]: I1006 08:33:06.963922 4755 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/console-64d5fd7569-rc8rd" event={"ID":"6c46e1f6-79bb-4357-9207-9cfd53f174ad","Type":"ContainerStarted","Data":"32c6e72e420f9c6b1e7f679605fc4e61140c453a3ca202d04fb3ed667b29d29f"} Oct 06 08:33:06 crc kubenswrapper[4755]: I1006 08:33:06.967602 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" event={"ID":"d3aab626-14e3-4151-b7ea-7af710945fee","Type":"ContainerStarted","Data":"98dee9c6219b2e9a31c1cff25434f375f7d293f3e41ab59bd15bd3dbb6b54249"} Oct 06 08:33:06 crc kubenswrapper[4755]: I1006 08:33:06.982537 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64d5fd7569-rc8rd" podStartSLOduration=1.982514524 podStartE2EDuration="1.982514524s" podCreationTimestamp="2025-10-06 08:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:33:06.980326124 +0000 UTC m=+643.809641348" watchObservedRunningTime="2025-10-06 08:33:06.982514524 +0000 UTC m=+643.811829738" Oct 06 08:33:07 crc kubenswrapper[4755]: I1006 08:33:07.980140 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2ltnx" event={"ID":"10aaa9f0-000d-46a3-8108-9e1f04820012","Type":"ContainerStarted","Data":"e0d6f5a9ee94b6e89bebbb3e73053fe6d4e162c8155b08f5b39fb4270a9d819d"} Oct 06 08:33:08 crc kubenswrapper[4755]: I1006 08:33:08.988744 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-q99m9" event={"ID":"5209c2e5-4435-4b41-950e-b1909e4853dc","Type":"ContainerStarted","Data":"3c1ae52f2a7f415382ac0eee547d325caba31668caaf9056401e1279ad6fd946"} Oct 06 08:33:08 crc kubenswrapper[4755]: I1006 08:33:08.990172 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:08 crc kubenswrapper[4755]: I1006 
08:33:08.992178 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" event={"ID":"d3aab626-14e3-4151-b7ea-7af710945fee","Type":"ContainerStarted","Data":"392328437b294f1a56d59fe8df3992a513e05c6120d91cb5488fc96be30f8f0f"} Oct 06 08:33:08 crc kubenswrapper[4755]: I1006 08:33:08.992279 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" Oct 06 08:33:08 crc kubenswrapper[4755]: I1006 08:33:08.994626 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" event={"ID":"260d013b-89f5-4f12-959f-fe21b8a52fc6","Type":"ContainerStarted","Data":"9fbcc1a03fcfcdfa71dfcba575d2362b219c38e1a8b529bea5ef12a46eaa8db4"} Oct 06 08:33:09 crc kubenswrapper[4755]: I1006 08:33:09.007887 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-q99m9" podStartSLOduration=2.561507583 podStartE2EDuration="5.00786259s" podCreationTimestamp="2025-10-06 08:33:04 +0000 UTC" firstStartedPulling="2025-10-06 08:33:05.350437469 +0000 UTC m=+642.179752683" lastFinishedPulling="2025-10-06 08:33:07.796792476 +0000 UTC m=+644.626107690" observedRunningTime="2025-10-06 08:33:09.006409698 +0000 UTC m=+645.835725002" watchObservedRunningTime="2025-10-06 08:33:09.00786259 +0000 UTC m=+645.837177804" Oct 06 08:33:09 crc kubenswrapper[4755]: I1006 08:33:09.029070 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vf4h7" podStartSLOduration=1.147749956 podStartE2EDuration="4.029022649s" podCreationTimestamp="2025-10-06 08:33:05 +0000 UTC" firstStartedPulling="2025-10-06 08:33:05.828837168 +0000 UTC m=+642.658152382" lastFinishedPulling="2025-10-06 08:33:08.710109861 +0000 UTC m=+645.539425075" observedRunningTime="2025-10-06 08:33:09.022328538 +0000 UTC m=+645.851643752" 
watchObservedRunningTime="2025-10-06 08:33:09.029022649 +0000 UTC m=+645.858337873" Oct 06 08:33:09 crc kubenswrapper[4755]: I1006 08:33:09.050011 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" podStartSLOduration=3.270164684 podStartE2EDuration="5.049980744s" podCreationTimestamp="2025-10-06 08:33:04 +0000 UTC" firstStartedPulling="2025-10-06 08:33:05.996208416 +0000 UTC m=+642.825558961" lastFinishedPulling="2025-10-06 08:33:07.776059797 +0000 UTC m=+644.605375021" observedRunningTime="2025-10-06 08:33:09.04538455 +0000 UTC m=+645.874699764" watchObservedRunningTime="2025-10-06 08:33:09.049980744 +0000 UTC m=+645.879295958" Oct 06 08:33:10 crc kubenswrapper[4755]: I1006 08:33:10.000614 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2ltnx" event={"ID":"10aaa9f0-000d-46a3-8108-9e1f04820012","Type":"ContainerStarted","Data":"424de08da64be9eab3bd8ade078faa15f2bb69d97b55bc9efd3bfa986524224a"} Oct 06 08:33:10 crc kubenswrapper[4755]: I1006 08:33:10.018923 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2ltnx" podStartSLOduration=1.644659389 podStartE2EDuration="6.018868476s" podCreationTimestamp="2025-10-06 08:33:04 +0000 UTC" firstStartedPulling="2025-10-06 08:33:05.480323909 +0000 UTC m=+642.309639113" lastFinishedPulling="2025-10-06 08:33:09.854532986 +0000 UTC m=+646.683848200" observedRunningTime="2025-10-06 08:33:10.016521464 +0000 UTC m=+646.845836708" watchObservedRunningTime="2025-10-06 08:33:10.018868476 +0000 UTC m=+646.848183690" Oct 06 08:33:15 crc kubenswrapper[4755]: I1006 08:33:15.320646 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-q99m9" Oct 06 08:33:15 crc kubenswrapper[4755]: I1006 08:33:15.607254 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:15 crc kubenswrapper[4755]: I1006 08:33:15.607329 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:15 crc kubenswrapper[4755]: I1006 08:33:15.614942 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:16 crc kubenswrapper[4755]: I1006 08:33:16.046704 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64d5fd7569-rc8rd" Oct 06 08:33:16 crc kubenswrapper[4755]: I1006 08:33:16.123838 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nrx4l"] Oct 06 08:33:25 crc kubenswrapper[4755]: I1006 08:33:25.572026 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dxh86" Oct 06 08:33:38 crc kubenswrapper[4755]: I1006 08:33:38.626169 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6"] Oct 06 08:33:38 crc kubenswrapper[4755]: I1006 08:33:38.628817 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" Oct 06 08:33:38 crc kubenswrapper[4755]: I1006 08:33:38.631213 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 08:33:38 crc kubenswrapper[4755]: I1006 08:33:38.632233 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6"] Oct 06 08:33:38 crc kubenswrapper[4755]: I1006 08:33:38.793584 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/274bae86-c37c-47a1-9f5a-842fec70c251-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6\" (UID: \"274bae86-c37c-47a1-9f5a-842fec70c251\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" Oct 06 08:33:38 crc kubenswrapper[4755]: I1006 08:33:38.793627 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhpww\" (UniqueName: \"kubernetes.io/projected/274bae86-c37c-47a1-9f5a-842fec70c251-kube-api-access-mhpww\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6\" (UID: \"274bae86-c37c-47a1-9f5a-842fec70c251\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" Oct 06 08:33:38 crc kubenswrapper[4755]: I1006 08:33:38.793663 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/274bae86-c37c-47a1-9f5a-842fec70c251-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6\" (UID: \"274bae86-c37c-47a1-9f5a-842fec70c251\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" Oct 06 08:33:38 crc kubenswrapper[4755]: 
I1006 08:33:38.895070 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/274bae86-c37c-47a1-9f5a-842fec70c251-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6\" (UID: \"274bae86-c37c-47a1-9f5a-842fec70c251\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" Oct 06 08:33:38 crc kubenswrapper[4755]: I1006 08:33:38.895124 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhpww\" (UniqueName: \"kubernetes.io/projected/274bae86-c37c-47a1-9f5a-842fec70c251-kube-api-access-mhpww\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6\" (UID: \"274bae86-c37c-47a1-9f5a-842fec70c251\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" Oct 06 08:33:38 crc kubenswrapper[4755]: I1006 08:33:38.895391 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/274bae86-c37c-47a1-9f5a-842fec70c251-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6\" (UID: \"274bae86-c37c-47a1-9f5a-842fec70c251\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" Oct 06 08:33:38 crc kubenswrapper[4755]: I1006 08:33:38.896053 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/274bae86-c37c-47a1-9f5a-842fec70c251-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6\" (UID: \"274bae86-c37c-47a1-9f5a-842fec70c251\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" Oct 06 08:33:38 crc kubenswrapper[4755]: I1006 08:33:38.896084 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/274bae86-c37c-47a1-9f5a-842fec70c251-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6\" (UID: \"274bae86-c37c-47a1-9f5a-842fec70c251\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" Oct 06 08:33:38 crc kubenswrapper[4755]: I1006 08:33:38.917661 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhpww\" (UniqueName: \"kubernetes.io/projected/274bae86-c37c-47a1-9f5a-842fec70c251-kube-api-access-mhpww\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6\" (UID: \"274bae86-c37c-47a1-9f5a-842fec70c251\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" Oct 06 08:33:38 crc kubenswrapper[4755]: I1006 08:33:38.987318 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" Oct 06 08:33:39 crc kubenswrapper[4755]: I1006 08:33:39.440803 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6"] Oct 06 08:33:40 crc kubenswrapper[4755]: I1006 08:33:40.217851 4755 generic.go:334] "Generic (PLEG): container finished" podID="274bae86-c37c-47a1-9f5a-842fec70c251" containerID="805ecda8d8db179e482425402bc43d2e9240efa01eb82aa621379d588ffd1afb" exitCode=0 Oct 06 08:33:40 crc kubenswrapper[4755]: I1006 08:33:40.217944 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" event={"ID":"274bae86-c37c-47a1-9f5a-842fec70c251","Type":"ContainerDied","Data":"805ecda8d8db179e482425402bc43d2e9240efa01eb82aa621379d588ffd1afb"} Oct 06 08:33:40 crc kubenswrapper[4755]: I1006 08:33:40.217983 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" event={"ID":"274bae86-c37c-47a1-9f5a-842fec70c251","Type":"ContainerStarted","Data":"a69deb820d772f63520a0547975c7832f33997b9cc2eb25c1b50e0f0a3ffd1d6"} Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.169691 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nrx4l" podUID="d5ef001b-4224-45ce-bdca-5865c9092f0e" containerName="console" containerID="cri-o://b05d1d892b0cc2ea6a3610e572f4e0c0bf88d19e103f57103befa4973fe3b560" gracePeriod=15 Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.590361 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nrx4l_d5ef001b-4224-45ce-bdca-5865c9092f0e/console/0.log" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.590844 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.732862 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-oauth-config\") pod \"d5ef001b-4224-45ce-bdca-5865c9092f0e\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.732951 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-oauth-serving-cert\") pod \"d5ef001b-4224-45ce-bdca-5865c9092f0e\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.732995 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-trusted-ca-bundle\") pod \"d5ef001b-4224-45ce-bdca-5865c9092f0e\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.733966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thch8\" (UniqueName: \"kubernetes.io/projected/d5ef001b-4224-45ce-bdca-5865c9092f0e-kube-api-access-thch8\") pod \"d5ef001b-4224-45ce-bdca-5865c9092f0e\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.734014 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-config\") pod \"d5ef001b-4224-45ce-bdca-5865c9092f0e\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.734097 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-serving-cert\") pod \"d5ef001b-4224-45ce-bdca-5865c9092f0e\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.734149 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-service-ca\") pod \"d5ef001b-4224-45ce-bdca-5865c9092f0e\" (UID: \"d5ef001b-4224-45ce-bdca-5865c9092f0e\") " Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.734691 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d5ef001b-4224-45ce-bdca-5865c9092f0e" (UID: "d5ef001b-4224-45ce-bdca-5865c9092f0e"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.734515 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d5ef001b-4224-45ce-bdca-5865c9092f0e" (UID: "d5ef001b-4224-45ce-bdca-5865c9092f0e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.734981 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-config" (OuterVolumeSpecName: "console-config") pod "d5ef001b-4224-45ce-bdca-5865c9092f0e" (UID: "d5ef001b-4224-45ce-bdca-5865c9092f0e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.735159 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-service-ca" (OuterVolumeSpecName: "service-ca") pod "d5ef001b-4224-45ce-bdca-5865c9092f0e" (UID: "d5ef001b-4224-45ce-bdca-5865c9092f0e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.741429 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ef001b-4224-45ce-bdca-5865c9092f0e-kube-api-access-thch8" (OuterVolumeSpecName: "kube-api-access-thch8") pod "d5ef001b-4224-45ce-bdca-5865c9092f0e" (UID: "d5ef001b-4224-45ce-bdca-5865c9092f0e"). InnerVolumeSpecName "kube-api-access-thch8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.745812 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d5ef001b-4224-45ce-bdca-5865c9092f0e" (UID: "d5ef001b-4224-45ce-bdca-5865c9092f0e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.746240 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d5ef001b-4224-45ce-bdca-5865c9092f0e" (UID: "d5ef001b-4224-45ce-bdca-5865c9092f0e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.835844 4755 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.835902 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.835926 4755 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.835945 4755 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.835964 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.835982 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thch8\" (UniqueName: \"kubernetes.io/projected/d5ef001b-4224-45ce-bdca-5865c9092f0e-kube-api-access-thch8\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:41 crc kubenswrapper[4755]: I1006 08:33:41.836002 4755 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5ef001b-4224-45ce-bdca-5865c9092f0e-console-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:42 crc kubenswrapper[4755]: I1006 08:33:42.235028 4755 generic.go:334] "Generic (PLEG): container finished" podID="274bae86-c37c-47a1-9f5a-842fec70c251" containerID="4b47aa917000b4398ee7ba59fd45e1cd6841e59eb26bcb7fe72c31c3626af426" exitCode=0 Oct 06 08:33:42 crc kubenswrapper[4755]: I1006 08:33:42.235220 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" event={"ID":"274bae86-c37c-47a1-9f5a-842fec70c251","Type":"ContainerDied","Data":"4b47aa917000b4398ee7ba59fd45e1cd6841e59eb26bcb7fe72c31c3626af426"} Oct 06 08:33:42 crc kubenswrapper[4755]: I1006 08:33:42.241635 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nrx4l_d5ef001b-4224-45ce-bdca-5865c9092f0e/console/0.log" Oct 06 08:33:42 crc kubenswrapper[4755]: I1006 08:33:42.241755 4755 generic.go:334] "Generic (PLEG): container finished" podID="d5ef001b-4224-45ce-bdca-5865c9092f0e" 
containerID="b05d1d892b0cc2ea6a3610e572f4e0c0bf88d19e103f57103befa4973fe3b560" exitCode=2 Oct 06 08:33:42 crc kubenswrapper[4755]: I1006 08:33:42.241827 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nrx4l" Oct 06 08:33:42 crc kubenswrapper[4755]: I1006 08:33:42.241833 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nrx4l" event={"ID":"d5ef001b-4224-45ce-bdca-5865c9092f0e","Type":"ContainerDied","Data":"b05d1d892b0cc2ea6a3610e572f4e0c0bf88d19e103f57103befa4973fe3b560"} Oct 06 08:33:42 crc kubenswrapper[4755]: I1006 08:33:42.241900 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nrx4l" event={"ID":"d5ef001b-4224-45ce-bdca-5865c9092f0e","Type":"ContainerDied","Data":"bb2cafe3c78d783a29f3bc3a41636a09a9f221213ff57c805e3743fe3304f0a0"} Oct 06 08:33:42 crc kubenswrapper[4755]: I1006 08:33:42.241941 4755 scope.go:117] "RemoveContainer" containerID="b05d1d892b0cc2ea6a3610e572f4e0c0bf88d19e103f57103befa4973fe3b560" Oct 06 08:33:42 crc kubenswrapper[4755]: I1006 08:33:42.285215 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nrx4l"] Oct 06 08:33:42 crc kubenswrapper[4755]: I1006 08:33:42.288849 4755 scope.go:117] "RemoveContainer" containerID="b05d1d892b0cc2ea6a3610e572f4e0c0bf88d19e103f57103befa4973fe3b560" Oct 06 08:33:42 crc kubenswrapper[4755]: E1006 08:33:42.289462 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b05d1d892b0cc2ea6a3610e572f4e0c0bf88d19e103f57103befa4973fe3b560\": container with ID starting with b05d1d892b0cc2ea6a3610e572f4e0c0bf88d19e103f57103befa4973fe3b560 not found: ID does not exist" containerID="b05d1d892b0cc2ea6a3610e572f4e0c0bf88d19e103f57103befa4973fe3b560" Oct 06 08:33:42 crc kubenswrapper[4755]: I1006 08:33:42.289515 4755 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b05d1d892b0cc2ea6a3610e572f4e0c0bf88d19e103f57103befa4973fe3b560"} err="failed to get container status \"b05d1d892b0cc2ea6a3610e572f4e0c0bf88d19e103f57103befa4973fe3b560\": rpc error: code = NotFound desc = could not find container \"b05d1d892b0cc2ea6a3610e572f4e0c0bf88d19e103f57103befa4973fe3b560\": container with ID starting with b05d1d892b0cc2ea6a3610e572f4e0c0bf88d19e103f57103befa4973fe3b560 not found: ID does not exist" Oct 06 08:33:42 crc kubenswrapper[4755]: I1006 08:33:42.290455 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nrx4l"] Oct 06 08:33:43 crc kubenswrapper[4755]: I1006 08:33:43.254236 4755 generic.go:334] "Generic (PLEG): container finished" podID="274bae86-c37c-47a1-9f5a-842fec70c251" containerID="f5ac651c20a3c16530cc321a6b9ecef0fc8761308d99f50e7207db11904bba48" exitCode=0 Oct 06 08:33:43 crc kubenswrapper[4755]: I1006 08:33:43.254284 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" event={"ID":"274bae86-c37c-47a1-9f5a-842fec70c251","Type":"ContainerDied","Data":"f5ac651c20a3c16530cc321a6b9ecef0fc8761308d99f50e7207db11904bba48"} Oct 06 08:33:43 crc kubenswrapper[4755]: I1006 08:33:43.891405 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ef001b-4224-45ce-bdca-5865c9092f0e" path="/var/lib/kubelet/pods/d5ef001b-4224-45ce-bdca-5865c9092f0e/volumes" Oct 06 08:33:44 crc kubenswrapper[4755]: I1006 08:33:44.494112 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" Oct 06 08:33:44 crc kubenswrapper[4755]: I1006 08:33:44.675332 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/274bae86-c37c-47a1-9f5a-842fec70c251-bundle\") pod \"274bae86-c37c-47a1-9f5a-842fec70c251\" (UID: \"274bae86-c37c-47a1-9f5a-842fec70c251\") " Oct 06 08:33:44 crc kubenswrapper[4755]: I1006 08:33:44.675405 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/274bae86-c37c-47a1-9f5a-842fec70c251-util\") pod \"274bae86-c37c-47a1-9f5a-842fec70c251\" (UID: \"274bae86-c37c-47a1-9f5a-842fec70c251\") " Oct 06 08:33:44 crc kubenswrapper[4755]: I1006 08:33:44.675532 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhpww\" (UniqueName: \"kubernetes.io/projected/274bae86-c37c-47a1-9f5a-842fec70c251-kube-api-access-mhpww\") pod \"274bae86-c37c-47a1-9f5a-842fec70c251\" (UID: \"274bae86-c37c-47a1-9f5a-842fec70c251\") " Oct 06 08:33:44 crc kubenswrapper[4755]: I1006 08:33:44.678308 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/274bae86-c37c-47a1-9f5a-842fec70c251-bundle" (OuterVolumeSpecName: "bundle") pod "274bae86-c37c-47a1-9f5a-842fec70c251" (UID: "274bae86-c37c-47a1-9f5a-842fec70c251"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:33:44 crc kubenswrapper[4755]: I1006 08:33:44.684788 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/274bae86-c37c-47a1-9f5a-842fec70c251-kube-api-access-mhpww" (OuterVolumeSpecName: "kube-api-access-mhpww") pod "274bae86-c37c-47a1-9f5a-842fec70c251" (UID: "274bae86-c37c-47a1-9f5a-842fec70c251"). InnerVolumeSpecName "kube-api-access-mhpww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:33:44 crc kubenswrapper[4755]: I1006 08:33:44.712463 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/274bae86-c37c-47a1-9f5a-842fec70c251-util" (OuterVolumeSpecName: "util") pod "274bae86-c37c-47a1-9f5a-842fec70c251" (UID: "274bae86-c37c-47a1-9f5a-842fec70c251"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:33:44 crc kubenswrapper[4755]: I1006 08:33:44.777124 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/274bae86-c37c-47a1-9f5a-842fec70c251-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:44 crc kubenswrapper[4755]: I1006 08:33:44.777158 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/274bae86-c37c-47a1-9f5a-842fec70c251-util\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:44 crc kubenswrapper[4755]: I1006 08:33:44.777172 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhpww\" (UniqueName: \"kubernetes.io/projected/274bae86-c37c-47a1-9f5a-842fec70c251-kube-api-access-mhpww\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:45 crc kubenswrapper[4755]: I1006 08:33:45.274943 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" event={"ID":"274bae86-c37c-47a1-9f5a-842fec70c251","Type":"ContainerDied","Data":"a69deb820d772f63520a0547975c7832f33997b9cc2eb25c1b50e0f0a3ffd1d6"} Oct 06 08:33:45 crc kubenswrapper[4755]: I1006 08:33:45.275015 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a69deb820d772f63520a0547975c7832f33997b9cc2eb25c1b50e0f0a3ffd1d6" Oct 06 08:33:45 crc kubenswrapper[4755]: I1006 08:33:45.275032 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.955218 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk"] Oct 06 08:33:53 crc kubenswrapper[4755]: E1006 08:33:53.955976 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274bae86-c37c-47a1-9f5a-842fec70c251" containerName="util" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.955990 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="274bae86-c37c-47a1-9f5a-842fec70c251" containerName="util" Oct 06 08:33:53 crc kubenswrapper[4755]: E1006 08:33:53.956003 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274bae86-c37c-47a1-9f5a-842fec70c251" containerName="pull" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.956009 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="274bae86-c37c-47a1-9f5a-842fec70c251" containerName="pull" Oct 06 08:33:53 crc kubenswrapper[4755]: E1006 08:33:53.956017 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274bae86-c37c-47a1-9f5a-842fec70c251" containerName="extract" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.956023 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="274bae86-c37c-47a1-9f5a-842fec70c251" containerName="extract" Oct 06 08:33:53 crc kubenswrapper[4755]: E1006 08:33:53.956036 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ef001b-4224-45ce-bdca-5865c9092f0e" containerName="console" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.956044 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ef001b-4224-45ce-bdca-5865c9092f0e" containerName="console" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.956139 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="274bae86-c37c-47a1-9f5a-842fec70c251" 
containerName="extract" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.956157 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ef001b-4224-45ce-bdca-5865c9092f0e" containerName="console" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.956549 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.958547 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.958809 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.958924 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.959087 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.962834 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-bc4lx" Oct 06 08:33:53 crc kubenswrapper[4755]: I1006 08:33:53.970611 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk"] Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.093692 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24xfh\" (UniqueName: \"kubernetes.io/projected/f3ddc2a6-55c5-42e4-bc84-82413245a1a6-kube-api-access-24xfh\") pod \"metallb-operator-controller-manager-746849c9fd-kwxmk\" (UID: \"f3ddc2a6-55c5-42e4-bc84-82413245a1a6\") " 
pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.093745 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3ddc2a6-55c5-42e4-bc84-82413245a1a6-webhook-cert\") pod \"metallb-operator-controller-manager-746849c9fd-kwxmk\" (UID: \"f3ddc2a6-55c5-42e4-bc84-82413245a1a6\") " pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.093775 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3ddc2a6-55c5-42e4-bc84-82413245a1a6-apiservice-cert\") pod \"metallb-operator-controller-manager-746849c9fd-kwxmk\" (UID: \"f3ddc2a6-55c5-42e4-bc84-82413245a1a6\") " pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.195168 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3ddc2a6-55c5-42e4-bc84-82413245a1a6-apiservice-cert\") pod \"metallb-operator-controller-manager-746849c9fd-kwxmk\" (UID: \"f3ddc2a6-55c5-42e4-bc84-82413245a1a6\") " pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.195642 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24xfh\" (UniqueName: \"kubernetes.io/projected/f3ddc2a6-55c5-42e4-bc84-82413245a1a6-kube-api-access-24xfh\") pod \"metallb-operator-controller-manager-746849c9fd-kwxmk\" (UID: \"f3ddc2a6-55c5-42e4-bc84-82413245a1a6\") " pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.195671 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3ddc2a6-55c5-42e4-bc84-82413245a1a6-webhook-cert\") pod \"metallb-operator-controller-manager-746849c9fd-kwxmk\" (UID: \"f3ddc2a6-55c5-42e4-bc84-82413245a1a6\") " pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.201715 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3ddc2a6-55c5-42e4-bc84-82413245a1a6-apiservice-cert\") pod \"metallb-operator-controller-manager-746849c9fd-kwxmk\" (UID: \"f3ddc2a6-55c5-42e4-bc84-82413245a1a6\") " pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.205630 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-688944c458-dszrn"] Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.206430 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:33:54 crc kubenswrapper[4755]: W1006 08:33:54.210133 4755 reflector.go:561] object-"metallb-system"/"metallb-operator-webhook-server-service-cert": failed to list *v1.Secret: secrets "metallb-operator-webhook-server-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Oct 06 08:33:54 crc kubenswrapper[4755]: E1006 08:33:54.210172 4755 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-operator-webhook-server-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-operator-webhook-server-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:33:54 crc kubenswrapper[4755]: W1006 08:33:54.210232 4755 reflector.go:561] object-"metallb-system"/"controller-dockercfg-k2r86": failed to list *v1.Secret: secrets "controller-dockercfg-k2r86" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Oct 06 08:33:54 crc kubenswrapper[4755]: E1006 08:33:54.210246 4755 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"controller-dockercfg-k2r86\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"controller-dockercfg-k2r86\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:33:54 crc kubenswrapper[4755]: W1006 08:33:54.210283 4755 reflector.go:561] 
object-"metallb-system"/"metallb-webhook-cert": failed to list *v1.Secret: secrets "metallb-webhook-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Oct 06 08:33:54 crc kubenswrapper[4755]: E1006 08:33:54.210296 4755 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-webhook-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-webhook-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.212199 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3ddc2a6-55c5-42e4-bc84-82413245a1a6-webhook-cert\") pod \"metallb-operator-controller-manager-746849c9fd-kwxmk\" (UID: \"f3ddc2a6-55c5-42e4-bc84-82413245a1a6\") " pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.219253 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24xfh\" (UniqueName: \"kubernetes.io/projected/f3ddc2a6-55c5-42e4-bc84-82413245a1a6-kube-api-access-24xfh\") pod \"metallb-operator-controller-manager-746849c9fd-kwxmk\" (UID: \"f3ddc2a6-55c5-42e4-bc84-82413245a1a6\") " pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.225796 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-688944c458-dszrn"] Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.270977 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.418452 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zv2b\" (UniqueName: \"kubernetes.io/projected/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-kube-api-access-7zv2b\") pod \"metallb-operator-webhook-server-688944c458-dszrn\" (UID: \"fccaa716-5ad5-4994-b4c3-352cc2a11a2e\") " pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.418549 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-apiservice-cert\") pod \"metallb-operator-webhook-server-688944c458-dszrn\" (UID: \"fccaa716-5ad5-4994-b4c3-352cc2a11a2e\") " pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.419160 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-webhook-cert\") pod \"metallb-operator-webhook-server-688944c458-dszrn\" (UID: \"fccaa716-5ad5-4994-b4c3-352cc2a11a2e\") " pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.530212 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-webhook-cert\") pod \"metallb-operator-webhook-server-688944c458-dszrn\" (UID: \"fccaa716-5ad5-4994-b4c3-352cc2a11a2e\") " pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.530282 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7zv2b\" (UniqueName: \"kubernetes.io/projected/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-kube-api-access-7zv2b\") pod \"metallb-operator-webhook-server-688944c458-dszrn\" (UID: \"fccaa716-5ad5-4994-b4c3-352cc2a11a2e\") " pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.530333 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-apiservice-cert\") pod \"metallb-operator-webhook-server-688944c458-dszrn\" (UID: \"fccaa716-5ad5-4994-b4c3-352cc2a11a2e\") " pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.575191 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zv2b\" (UniqueName: \"kubernetes.io/projected/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-kube-api-access-7zv2b\") pod \"metallb-operator-webhook-server-688944c458-dszrn\" (UID: \"fccaa716-5ad5-4994-b4c3-352cc2a11a2e\") " pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:33:54 crc kubenswrapper[4755]: I1006 08:33:54.797388 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk"] Oct 06 08:33:54 crc kubenswrapper[4755]: W1006 08:33:54.809748 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3ddc2a6_55c5_42e4_bc84_82413245a1a6.slice/crio-0c14617e374105dfe27ecab5aff5a33b31ab5d0da2a749811600c038302a8413 WatchSource:0}: Error finding container 0c14617e374105dfe27ecab5aff5a33b31ab5d0da2a749811600c038302a8413: Status 404 returned error can't find the container with id 0c14617e374105dfe27ecab5aff5a33b31ab5d0da2a749811600c038302a8413 Oct 06 08:33:55 crc kubenswrapper[4755]: 
I1006 08:33:55.319955 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-k2r86" Oct 06 08:33:55 crc kubenswrapper[4755]: I1006 08:33:55.338253 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" event={"ID":"f3ddc2a6-55c5-42e4-bc84-82413245a1a6","Type":"ContainerStarted","Data":"0c14617e374105dfe27ecab5aff5a33b31ab5d0da2a749811600c038302a8413"} Oct 06 08:33:55 crc kubenswrapper[4755]: E1006 08:33:55.531050 4755 secret.go:188] Couldn't get secret metallb-system/metallb-operator-webhook-server-service-cert: failed to sync secret cache: timed out waiting for the condition Oct 06 08:33:55 crc kubenswrapper[4755]: E1006 08:33:55.531172 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-webhook-cert podName:fccaa716-5ad5-4994-b4c3-352cc2a11a2e nodeName:}" failed. No retries permitted until 2025-10-06 08:33:56.031149021 +0000 UTC m=+692.860464235 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-webhook-cert") pod "metallb-operator-webhook-server-688944c458-dszrn" (UID: "fccaa716-5ad5-4994-b4c3-352cc2a11a2e") : failed to sync secret cache: timed out waiting for the condition Oct 06 08:33:55 crc kubenswrapper[4755]: E1006 08:33:55.531058 4755 secret.go:188] Couldn't get secret metallb-system/metallb-operator-webhook-server-service-cert: failed to sync secret cache: timed out waiting for the condition Oct 06 08:33:55 crc kubenswrapper[4755]: E1006 08:33:55.531497 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-apiservice-cert podName:fccaa716-5ad5-4994-b4c3-352cc2a11a2e nodeName:}" failed. 
No retries permitted until 2025-10-06 08:33:56.031473269 +0000 UTC m=+692.860788483 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-apiservice-cert") pod "metallb-operator-webhook-server-688944c458-dszrn" (UID: "fccaa716-5ad5-4994-b4c3-352cc2a11a2e") : failed to sync secret cache: timed out waiting for the condition Oct 06 08:33:55 crc kubenswrapper[4755]: I1006 08:33:55.595872 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 06 08:33:55 crc kubenswrapper[4755]: I1006 08:33:55.625539 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 06 08:33:56 crc kubenswrapper[4755]: I1006 08:33:56.049349 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-apiservice-cert\") pod \"metallb-operator-webhook-server-688944c458-dszrn\" (UID: \"fccaa716-5ad5-4994-b4c3-352cc2a11a2e\") " pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:33:56 crc kubenswrapper[4755]: I1006 08:33:56.049426 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-webhook-cert\") pod \"metallb-operator-webhook-server-688944c458-dszrn\" (UID: \"fccaa716-5ad5-4994-b4c3-352cc2a11a2e\") " pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:33:56 crc kubenswrapper[4755]: I1006 08:33:56.059260 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-webhook-cert\") pod \"metallb-operator-webhook-server-688944c458-dszrn\" (UID: 
\"fccaa716-5ad5-4994-b4c3-352cc2a11a2e\") " pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:33:56 crc kubenswrapper[4755]: I1006 08:33:56.059341 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fccaa716-5ad5-4994-b4c3-352cc2a11a2e-apiservice-cert\") pod \"metallb-operator-webhook-server-688944c458-dszrn\" (UID: \"fccaa716-5ad5-4994-b4c3-352cc2a11a2e\") " pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:33:56 crc kubenswrapper[4755]: I1006 08:33:56.080722 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:33:56 crc kubenswrapper[4755]: I1006 08:33:56.587769 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-688944c458-dszrn"] Oct 06 08:33:56 crc kubenswrapper[4755]: W1006 08:33:56.593911 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfccaa716_5ad5_4994_b4c3_352cc2a11a2e.slice/crio-95320299b95bfefb7afbe6536e8e2307f6f2b279e8d73be05ccb847db866dbaf WatchSource:0}: Error finding container 95320299b95bfefb7afbe6536e8e2307f6f2b279e8d73be05ccb847db866dbaf: Status 404 returned error can't find the container with id 95320299b95bfefb7afbe6536e8e2307f6f2b279e8d73be05ccb847db866dbaf Oct 06 08:33:57 crc kubenswrapper[4755]: I1006 08:33:57.357210 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" event={"ID":"fccaa716-5ad5-4994-b4c3-352cc2a11a2e","Type":"ContainerStarted","Data":"95320299b95bfefb7afbe6536e8e2307f6f2b279e8d73be05ccb847db866dbaf"} Oct 06 08:33:58 crc kubenswrapper[4755]: I1006 08:33:58.372416 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" event={"ID":"f3ddc2a6-55c5-42e4-bc84-82413245a1a6","Type":"ContainerStarted","Data":"b0698813fdd70f7d0ac3121f96b1fff8cb8f55ca3af296d5d5a8864f8bde1b6c"} Oct 06 08:33:58 crc kubenswrapper[4755]: I1006 08:33:58.372587 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" Oct 06 08:33:58 crc kubenswrapper[4755]: I1006 08:33:58.399620 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" podStartSLOduration=2.7131760099999997 podStartE2EDuration="5.399596087s" podCreationTimestamp="2025-10-06 08:33:53 +0000 UTC" firstStartedPulling="2025-10-06 08:33:54.811885136 +0000 UTC m=+691.641200350" lastFinishedPulling="2025-10-06 08:33:57.498305213 +0000 UTC m=+694.327620427" observedRunningTime="2025-10-06 08:33:58.390346026 +0000 UTC m=+695.219661240" watchObservedRunningTime="2025-10-06 08:33:58.399596087 +0000 UTC m=+695.228911301" Oct 06 08:34:01 crc kubenswrapper[4755]: I1006 08:34:01.393070 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" event={"ID":"fccaa716-5ad5-4994-b4c3-352cc2a11a2e","Type":"ContainerStarted","Data":"63b3eceed482e74092583ecc2ec2efd73579d1552c161b8eb1d1ccdaf5753bdc"} Oct 06 08:34:01 crc kubenswrapper[4755]: I1006 08:34:01.393449 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:34:01 crc kubenswrapper[4755]: I1006 08:34:01.415398 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" podStartSLOduration=3.208048364 podStartE2EDuration="7.415375304s" podCreationTimestamp="2025-10-06 08:33:54 +0000 UTC" firstStartedPulling="2025-10-06 
08:33:56.596779753 +0000 UTC m=+693.426094967" lastFinishedPulling="2025-10-06 08:34:00.804106673 +0000 UTC m=+697.633421907" observedRunningTime="2025-10-06 08:34:01.41059256 +0000 UTC m=+698.239907794" watchObservedRunningTime="2025-10-06 08:34:01.415375304 +0000 UTC m=+698.244690518" Oct 06 08:34:16 crc kubenswrapper[4755]: I1006 08:34:16.085639 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-688944c458-dszrn" Oct 06 08:34:34 crc kubenswrapper[4755]: I1006 08:34:34.274190 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-746849c9fd-kwxmk" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.017144 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-gt2qg"] Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.019613 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.024172 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-jmnkr" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.024463 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.026533 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.033150 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d"] Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.033851 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.036475 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.045478 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d"] Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.070690 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/17120783-c2c7-4718-8a90-e89951659106-frr-conf\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.070734 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/17120783-c2c7-4718-8a90-e89951659106-frr-startup\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.070758 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/17120783-c2c7-4718-8a90-e89951659106-reloader\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.070777 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwtbw\" (UniqueName: \"kubernetes.io/projected/7c4e25ec-c928-4063-9c6b-2166042d476e-kube-api-access-kwtbw\") pod \"frr-k8s-webhook-server-64bf5d555-fft4d\" (UID: \"7c4e25ec-c928-4063-9c6b-2166042d476e\") " 
pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.070903 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/17120783-c2c7-4718-8a90-e89951659106-metrics\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.070947 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4z7b\" (UniqueName: \"kubernetes.io/projected/17120783-c2c7-4718-8a90-e89951659106-kube-api-access-r4z7b\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.070969 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17120783-c2c7-4718-8a90-e89951659106-metrics-certs\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.071017 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/17120783-c2c7-4718-8a90-e89951659106-frr-sockets\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.071071 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c4e25ec-c928-4063-9c6b-2166042d476e-cert\") pod \"frr-k8s-webhook-server-64bf5d555-fft4d\" (UID: \"7c4e25ec-c928-4063-9c6b-2166042d476e\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" Oct 06 
08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.111870 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-4rh89"] Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.113000 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.115073 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.115610 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.117437 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.117462 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-c8mw2" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.135029 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-rlwnm"] Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.135953 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.139332 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.155815 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-rlwnm"] Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.171817 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/17120783-c2c7-4718-8a90-e89951659106-frr-sockets\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.171871 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-memberlist\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.171893 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b397346-b157-4cbf-a489-07ba1c76a602-cert\") pod \"controller-68d546b9d8-rlwnm\" (UID: \"0b397346-b157-4cbf-a489-07ba1c76a602\") " pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.171916 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-metrics-certs\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.171988 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c4e25ec-c928-4063-9c6b-2166042d476e-cert\") pod \"frr-k8s-webhook-server-64bf5d555-fft4d\" (UID: \"7c4e25ec-c928-4063-9c6b-2166042d476e\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172015 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b397346-b157-4cbf-a489-07ba1c76a602-metrics-certs\") pod \"controller-68d546b9d8-rlwnm\" (UID: \"0b397346-b157-4cbf-a489-07ba1c76a602\") " pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172032 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8dc2\" (UniqueName: \"kubernetes.io/projected/0b397346-b157-4cbf-a489-07ba1c76a602-kube-api-access-t8dc2\") pod \"controller-68d546b9d8-rlwnm\" (UID: \"0b397346-b157-4cbf-a489-07ba1c76a602\") " pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172053 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/17120783-c2c7-4718-8a90-e89951659106-frr-conf\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172071 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/17120783-c2c7-4718-8a90-e89951659106-frr-startup\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172096 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/17120783-c2c7-4718-8a90-e89951659106-reloader\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172115 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwtbw\" (UniqueName: \"kubernetes.io/projected/7c4e25ec-c928-4063-9c6b-2166042d476e-kube-api-access-kwtbw\") pod \"frr-k8s-webhook-server-64bf5d555-fft4d\" (UID: \"7c4e25ec-c928-4063-9c6b-2166042d476e\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172142 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5b7e2120-cc02-4414-a2e4-55e198617480-metallb-excludel2\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172164 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/17120783-c2c7-4718-8a90-e89951659106-metrics\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: E1006 08:34:35.172161 4755 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172182 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv98q\" (UniqueName: \"kubernetes.io/projected/5b7e2120-cc02-4414-a2e4-55e198617480-kube-api-access-lv98q\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc 
kubenswrapper[4755]: I1006 08:34:35.172204 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4z7b\" (UniqueName: \"kubernetes.io/projected/17120783-c2c7-4718-8a90-e89951659106-kube-api-access-r4z7b\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172221 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17120783-c2c7-4718-8a90-e89951659106-metrics-certs\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: E1006 08:34:35.172247 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c4e25ec-c928-4063-9c6b-2166042d476e-cert podName:7c4e25ec-c928-4063-9c6b-2166042d476e nodeName:}" failed. No retries permitted until 2025-10-06 08:34:35.672219958 +0000 UTC m=+732.501535172 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c4e25ec-c928-4063-9c6b-2166042d476e-cert") pod "frr-k8s-webhook-server-64bf5d555-fft4d" (UID: "7c4e25ec-c928-4063-9c6b-2166042d476e") : secret "frr-k8s-webhook-server-cert" not found Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172304 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/17120783-c2c7-4718-8a90-e89951659106-frr-sockets\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172312 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/17120783-c2c7-4718-8a90-e89951659106-frr-conf\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172547 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/17120783-c2c7-4718-8a90-e89951659106-reloader\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.172784 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/17120783-c2c7-4718-8a90-e89951659106-metrics\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.173072 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/17120783-c2c7-4718-8a90-e89951659106-frr-startup\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" 
Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.178128 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17120783-c2c7-4718-8a90-e89951659106-metrics-certs\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.188630 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4z7b\" (UniqueName: \"kubernetes.io/projected/17120783-c2c7-4718-8a90-e89951659106-kube-api-access-r4z7b\") pod \"frr-k8s-gt2qg\" (UID: \"17120783-c2c7-4718-8a90-e89951659106\") " pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.192445 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwtbw\" (UniqueName: \"kubernetes.io/projected/7c4e25ec-c928-4063-9c6b-2166042d476e-kube-api-access-kwtbw\") pod \"frr-k8s-webhook-server-64bf5d555-fft4d\" (UID: \"7c4e25ec-c928-4063-9c6b-2166042d476e\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.272707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv98q\" (UniqueName: \"kubernetes.io/projected/5b7e2120-cc02-4414-a2e4-55e198617480-kube-api-access-lv98q\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.272769 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-memberlist\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.272790 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b397346-b157-4cbf-a489-07ba1c76a602-cert\") pod \"controller-68d546b9d8-rlwnm\" (UID: \"0b397346-b157-4cbf-a489-07ba1c76a602\") " pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.272811 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-metrics-certs\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.272852 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b397346-b157-4cbf-a489-07ba1c76a602-metrics-certs\") pod \"controller-68d546b9d8-rlwnm\" (UID: \"0b397346-b157-4cbf-a489-07ba1c76a602\") " pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.272874 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8dc2\" (UniqueName: \"kubernetes.io/projected/0b397346-b157-4cbf-a489-07ba1c76a602-kube-api-access-t8dc2\") pod \"controller-68d546b9d8-rlwnm\" (UID: \"0b397346-b157-4cbf-a489-07ba1c76a602\") " pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.272913 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5b7e2120-cc02-4414-a2e4-55e198617480-metallb-excludel2\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc kubenswrapper[4755]: E1006 08:34:35.272975 4755 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 08:34:35 crc kubenswrapper[4755]: E1006 
08:34:35.273034 4755 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 06 08:34:35 crc kubenswrapper[4755]: E1006 08:34:35.273059 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-memberlist podName:5b7e2120-cc02-4414-a2e4-55e198617480 nodeName:}" failed. No retries permitted until 2025-10-06 08:34:35.773037189 +0000 UTC m=+732.602352403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-memberlist") pod "speaker-4rh89" (UID: "5b7e2120-cc02-4414-a2e4-55e198617480") : secret "metallb-memberlist" not found Oct 06 08:34:35 crc kubenswrapper[4755]: E1006 08:34:35.273114 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-metrics-certs podName:5b7e2120-cc02-4414-a2e4-55e198617480 nodeName:}" failed. No retries permitted until 2025-10-06 08:34:35.77309409 +0000 UTC m=+732.602409304 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-metrics-certs") pod "speaker-4rh89" (UID: "5b7e2120-cc02-4414-a2e4-55e198617480") : secret "speaker-certs-secret" not found Oct 06 08:34:35 crc kubenswrapper[4755]: E1006 08:34:35.273132 4755 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 06 08:34:35 crc kubenswrapper[4755]: E1006 08:34:35.273187 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b397346-b157-4cbf-a489-07ba1c76a602-metrics-certs podName:0b397346-b157-4cbf-a489-07ba1c76a602 nodeName:}" failed. No retries permitted until 2025-10-06 08:34:35.773169942 +0000 UTC m=+732.602485246 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b397346-b157-4cbf-a489-07ba1c76a602-metrics-certs") pod "controller-68d546b9d8-rlwnm" (UID: "0b397346-b157-4cbf-a489-07ba1c76a602") : secret "controller-certs-secret" not found Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.273584 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5b7e2120-cc02-4414-a2e4-55e198617480-metallb-excludel2\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.274663 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.287105 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b397346-b157-4cbf-a489-07ba1c76a602-cert\") pod \"controller-68d546b9d8-rlwnm\" (UID: \"0b397346-b157-4cbf-a489-07ba1c76a602\") " pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.291522 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv98q\" (UniqueName: \"kubernetes.io/projected/5b7e2120-cc02-4414-a2e4-55e198617480-kube-api-access-lv98q\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.293179 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8dc2\" (UniqueName: \"kubernetes.io/projected/0b397346-b157-4cbf-a489-07ba1c76a602-kube-api-access-t8dc2\") pod \"controller-68d546b9d8-rlwnm\" (UID: \"0b397346-b157-4cbf-a489-07ba1c76a602\") " pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.335362 
4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.585138 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gt2qg" event={"ID":"17120783-c2c7-4718-8a90-e89951659106","Type":"ContainerStarted","Data":"5b73fc4ac5657056b15e737e84975ff0c77d7a5af66b3ed15b02bdb6c0862606"} Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.677242 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c4e25ec-c928-4063-9c6b-2166042d476e-cert\") pod \"frr-k8s-webhook-server-64bf5d555-fft4d\" (UID: \"7c4e25ec-c928-4063-9c6b-2166042d476e\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.686828 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c4e25ec-c928-4063-9c6b-2166042d476e-cert\") pod \"frr-k8s-webhook-server-64bf5d555-fft4d\" (UID: \"7c4e25ec-c928-4063-9c6b-2166042d476e\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.778681 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-memberlist\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.778753 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-metrics-certs\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.778796 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b397346-b157-4cbf-a489-07ba1c76a602-metrics-certs\") pod \"controller-68d546b9d8-rlwnm\" (UID: \"0b397346-b157-4cbf-a489-07ba1c76a602\") " pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:35 crc kubenswrapper[4755]: E1006 08:34:35.778838 4755 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 08:34:35 crc kubenswrapper[4755]: E1006 08:34:35.778903 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-memberlist podName:5b7e2120-cc02-4414-a2e4-55e198617480 nodeName:}" failed. No retries permitted until 2025-10-06 08:34:36.778884245 +0000 UTC m=+733.608199459 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-memberlist") pod "speaker-4rh89" (UID: "5b7e2120-cc02-4414-a2e4-55e198617480") : secret "metallb-memberlist" not found Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.782022 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b397346-b157-4cbf-a489-07ba1c76a602-metrics-certs\") pod \"controller-68d546b9d8-rlwnm\" (UID: \"0b397346-b157-4cbf-a489-07ba1c76a602\") " pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.782282 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-metrics-certs\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:35 crc kubenswrapper[4755]: I1006 08:34:35.947112 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" Oct 06 08:34:36 crc kubenswrapper[4755]: I1006 08:34:36.049311 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:36 crc kubenswrapper[4755]: I1006 08:34:36.229450 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-rlwnm"] Oct 06 08:34:36 crc kubenswrapper[4755]: W1006 08:34:36.235340 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b397346_b157_4cbf_a489_07ba1c76a602.slice/crio-d1b2aa95e32972d0ff21dd988e22820733046e5b21d680f25db060cb83857b17 WatchSource:0}: Error finding container d1b2aa95e32972d0ff21dd988e22820733046e5b21d680f25db060cb83857b17: Status 404 returned error can't find the container with id d1b2aa95e32972d0ff21dd988e22820733046e5b21d680f25db060cb83857b17 Oct 06 08:34:36 crc kubenswrapper[4755]: I1006 08:34:36.327576 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d"] Oct 06 08:34:36 crc kubenswrapper[4755]: W1006 08:34:36.332127 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c4e25ec_c928_4063_9c6b_2166042d476e.slice/crio-e7201bba51af856138c1af69762751cd63cc2a2f3480aae29d1c548cba27ec91 WatchSource:0}: Error finding container e7201bba51af856138c1af69762751cd63cc2a2f3480aae29d1c548cba27ec91: Status 404 returned error can't find the container with id e7201bba51af856138c1af69762751cd63cc2a2f3480aae29d1c548cba27ec91 Oct 06 08:34:36 crc kubenswrapper[4755]: I1006 08:34:36.592759 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-rlwnm" 
event={"ID":"0b397346-b157-4cbf-a489-07ba1c76a602","Type":"ContainerStarted","Data":"4ac002eba0b522d7ed1556306c7232e942b799e81a431b0f68f0948cf10d126c"} Oct 06 08:34:36 crc kubenswrapper[4755]: I1006 08:34:36.593099 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-rlwnm" event={"ID":"0b397346-b157-4cbf-a489-07ba1c76a602","Type":"ContainerStarted","Data":"f249d71a9503bb20cd68caf028c8dd1d97e3985d652438ee1abb172d9feded46"} Oct 06 08:34:36 crc kubenswrapper[4755]: I1006 08:34:36.593122 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:36 crc kubenswrapper[4755]: I1006 08:34:36.593136 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-rlwnm" event={"ID":"0b397346-b157-4cbf-a489-07ba1c76a602","Type":"ContainerStarted","Data":"d1b2aa95e32972d0ff21dd988e22820733046e5b21d680f25db060cb83857b17"} Oct 06 08:34:36 crc kubenswrapper[4755]: I1006 08:34:36.594190 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" event={"ID":"7c4e25ec-c928-4063-9c6b-2166042d476e","Type":"ContainerStarted","Data":"e7201bba51af856138c1af69762751cd63cc2a2f3480aae29d1c548cba27ec91"} Oct 06 08:34:36 crc kubenswrapper[4755]: I1006 08:34:36.618263 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-rlwnm" podStartSLOduration=1.618240514 podStartE2EDuration="1.618240514s" podCreationTimestamp="2025-10-06 08:34:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:34:36.605786427 +0000 UTC m=+733.435101641" watchObservedRunningTime="2025-10-06 08:34:36.618240514 +0000 UTC m=+733.447555748" Oct 06 08:34:36 crc kubenswrapper[4755]: I1006 08:34:36.794529 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-memberlist\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:36 crc kubenswrapper[4755]: I1006 08:34:36.800185 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5b7e2120-cc02-4414-a2e4-55e198617480-memberlist\") pod \"speaker-4rh89\" (UID: \"5b7e2120-cc02-4414-a2e4-55e198617480\") " pod="metallb-system/speaker-4rh89" Oct 06 08:34:36 crc kubenswrapper[4755]: I1006 08:34:36.926118 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-4rh89" Oct 06 08:34:36 crc kubenswrapper[4755]: W1006 08:34:36.951065 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b7e2120_cc02_4414_a2e4_55e198617480.slice/crio-ac782aad5ab6cd34359c8d46f86fcf61137e166ad0698f669a722dc6d656578b WatchSource:0}: Error finding container ac782aad5ab6cd34359c8d46f86fcf61137e166ad0698f669a722dc6d656578b: Status 404 returned error can't find the container with id ac782aad5ab6cd34359c8d46f86fcf61137e166ad0698f669a722dc6d656578b Oct 06 08:34:37 crc kubenswrapper[4755]: I1006 08:34:37.603953 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4rh89" event={"ID":"5b7e2120-cc02-4414-a2e4-55e198617480","Type":"ContainerStarted","Data":"973f8809c2133567e39d37402d0f9e1f59c91e3e24160a994264e3de048e1c70"} Oct 06 08:34:37 crc kubenswrapper[4755]: I1006 08:34:37.604258 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4rh89" event={"ID":"5b7e2120-cc02-4414-a2e4-55e198617480","Type":"ContainerStarted","Data":"7457551c0863da98304f045400950086f7de484ffbb92a53ee5663b1b63d5da9"} Oct 06 08:34:37 crc kubenswrapper[4755]: I1006 08:34:37.604268 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/speaker-4rh89" event={"ID":"5b7e2120-cc02-4414-a2e4-55e198617480","Type":"ContainerStarted","Data":"ac782aad5ab6cd34359c8d46f86fcf61137e166ad0698f669a722dc6d656578b"} Oct 06 08:34:37 crc kubenswrapper[4755]: I1006 08:34:37.604870 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-4rh89" Oct 06 08:34:37 crc kubenswrapper[4755]: I1006 08:34:37.624982 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-4rh89" podStartSLOduration=2.6249608269999998 podStartE2EDuration="2.624960827s" podCreationTimestamp="2025-10-06 08:34:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:34:37.619257411 +0000 UTC m=+734.448572625" watchObservedRunningTime="2025-10-06 08:34:37.624960827 +0000 UTC m=+734.454276041" Oct 06 08:34:42 crc kubenswrapper[4755]: I1006 08:34:42.639657 4755 generic.go:334] "Generic (PLEG): container finished" podID="17120783-c2c7-4718-8a90-e89951659106" containerID="991e79be297116946ae73713f52592697b0615da767119233b70aa53a0995090" exitCode=0 Oct 06 08:34:42 crc kubenswrapper[4755]: I1006 08:34:42.639744 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gt2qg" event={"ID":"17120783-c2c7-4718-8a90-e89951659106","Type":"ContainerDied","Data":"991e79be297116946ae73713f52592697b0615da767119233b70aa53a0995090"} Oct 06 08:34:42 crc kubenswrapper[4755]: I1006 08:34:42.647449 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" event={"ID":"7c4e25ec-c928-4063-9c6b-2166042d476e","Type":"ContainerStarted","Data":"e794a1a04295f66e2303e84ed6b1fde9ddcfb1cfee75142a02c9aa4544e91579"} Oct 06 08:34:42 crc kubenswrapper[4755]: I1006 08:34:42.647619 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" Oct 06 08:34:42 crc kubenswrapper[4755]: I1006 08:34:42.687453 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" podStartSLOduration=1.750017802 podStartE2EDuration="7.687426233s" podCreationTimestamp="2025-10-06 08:34:35 +0000 UTC" firstStartedPulling="2025-10-06 08:34:36.334731872 +0000 UTC m=+733.164047086" lastFinishedPulling="2025-10-06 08:34:42.272140303 +0000 UTC m=+739.101455517" observedRunningTime="2025-10-06 08:34:42.683415377 +0000 UTC m=+739.512730591" watchObservedRunningTime="2025-10-06 08:34:42.687426233 +0000 UTC m=+739.516741447" Oct 06 08:34:43 crc kubenswrapper[4755]: I1006 08:34:43.655853 4755 generic.go:334] "Generic (PLEG): container finished" podID="17120783-c2c7-4718-8a90-e89951659106" containerID="e15d99b0ff31781697558b9770706bf285d884cd26fbaac1f04cd3f33b4e5c1f" exitCode=0 Oct 06 08:34:43 crc kubenswrapper[4755]: I1006 08:34:43.655951 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gt2qg" event={"ID":"17120783-c2c7-4718-8a90-e89951659106","Type":"ContainerDied","Data":"e15d99b0ff31781697558b9770706bf285d884cd26fbaac1f04cd3f33b4e5c1f"} Oct 06 08:34:44 crc kubenswrapper[4755]: I1006 08:34:44.665106 4755 generic.go:334] "Generic (PLEG): container finished" podID="17120783-c2c7-4718-8a90-e89951659106" containerID="1e6786ac6690761cbbc3a0c3c36ee1d2239ef30d857e4c36cc79924c9bb3f0bd" exitCode=0 Oct 06 08:34:44 crc kubenswrapper[4755]: I1006 08:34:44.665181 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gt2qg" event={"ID":"17120783-c2c7-4718-8a90-e89951659106","Type":"ContainerDied","Data":"1e6786ac6690761cbbc3a0c3c36ee1d2239ef30d857e4c36cc79924c9bb3f0bd"} Oct 06 08:34:45 crc kubenswrapper[4755]: I1006 08:34:45.674333 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gt2qg" 
event={"ID":"17120783-c2c7-4718-8a90-e89951659106","Type":"ContainerStarted","Data":"784e1a2ed14e19527135ec0cab239b7c62250f409b28bbcdf1890a13d737f7eb"} Oct 06 08:34:45 crc kubenswrapper[4755]: I1006 08:34:45.674702 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:45 crc kubenswrapper[4755]: I1006 08:34:45.674714 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gt2qg" event={"ID":"17120783-c2c7-4718-8a90-e89951659106","Type":"ContainerStarted","Data":"74ab779eb17c7ea7063667fb0b4fc6fbf6712c41bff7f70fa05ddf9b03ccc78a"} Oct 06 08:34:45 crc kubenswrapper[4755]: I1006 08:34:45.674722 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gt2qg" event={"ID":"17120783-c2c7-4718-8a90-e89951659106","Type":"ContainerStarted","Data":"19b7357d4bd3d3fc1d89a0cacda72b401e34c7eb4cee5ff1f66fb89206a990c3"} Oct 06 08:34:45 crc kubenswrapper[4755]: I1006 08:34:45.674730 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gt2qg" event={"ID":"17120783-c2c7-4718-8a90-e89951659106","Type":"ContainerStarted","Data":"d669c63fe36461b7ce7277a454800135dd1bccad5ab614c2db4462197137b275"} Oct 06 08:34:45 crc kubenswrapper[4755]: I1006 08:34:45.674741 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gt2qg" event={"ID":"17120783-c2c7-4718-8a90-e89951659106","Type":"ContainerStarted","Data":"21a8c88d066eedc22b163d62a131ed5a22386e66d252158816712cbe62743663"} Oct 06 08:34:45 crc kubenswrapper[4755]: I1006 08:34:45.674749 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gt2qg" event={"ID":"17120783-c2c7-4718-8a90-e89951659106","Type":"ContainerStarted","Data":"bbdbaeb9b7254bc57eca2563e9f83aac9fbb42837f5750b87c86b5777c86ec2a"} Oct 06 08:34:45 crc kubenswrapper[4755]: I1006 08:34:45.702772 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-gt2qg" podStartSLOduration=4.92813611 podStartE2EDuration="11.702748248s" podCreationTimestamp="2025-10-06 08:34:34 +0000 UTC" firstStartedPulling="2025-10-06 08:34:35.478378139 +0000 UTC m=+732.307693363" lastFinishedPulling="2025-10-06 08:34:42.252990287 +0000 UTC m=+739.082305501" observedRunningTime="2025-10-06 08:34:45.699162323 +0000 UTC m=+742.528477547" watchObservedRunningTime="2025-10-06 08:34:45.702748248 +0000 UTC m=+742.532063462" Oct 06 08:34:46 crc kubenswrapper[4755]: I1006 08:34:46.053644 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-rlwnm" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.071898 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4skj5"] Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.072497 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" podUID="c98cbede-25b7-40d4-b1ad-18e144e46bcc" containerName="controller-manager" containerID="cri-o://0c6a770ef2710b787fe22e0451b60021c844573cc17ef56bedaa296edfa8ee15" gracePeriod=30 Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.180397 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs"] Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.180675 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" podUID="c6c65a52-4ea4-4b9c-b128-3f11b7bc0227" containerName="route-controller-manager" containerID="cri-o://06415cdcb3b3e07252b4c11041d1a58820b52190461e377045bc992d20f8d6ef" gracePeriod=30 Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.505656 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.572120 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.606399 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-client-ca\") pod \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.606496 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7rk7\" (UniqueName: \"kubernetes.io/projected/c98cbede-25b7-40d4-b1ad-18e144e46bcc-kube-api-access-r7rk7\") pod \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.606526 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-proxy-ca-bundles\") pod \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.607277 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c98cbede-25b7-40d4-b1ad-18e144e46bcc" (UID: "c98cbede-25b7-40d4-b1ad-18e144e46bcc"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.607323 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-client-ca" (OuterVolumeSpecName: "client-ca") pod "c98cbede-25b7-40d4-b1ad-18e144e46bcc" (UID: "c98cbede-25b7-40d4-b1ad-18e144e46bcc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.607555 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-config\") pod \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.607605 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c98cbede-25b7-40d4-b1ad-18e144e46bcc-serving-cert\") pod \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\" (UID: \"c98cbede-25b7-40d4-b1ad-18e144e46bcc\") " Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.607844 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.607863 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.608325 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-config" (OuterVolumeSpecName: "config") pod "c98cbede-25b7-40d4-b1ad-18e144e46bcc" (UID: 
"c98cbede-25b7-40d4-b1ad-18e144e46bcc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.613196 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98cbede-25b7-40d4-b1ad-18e144e46bcc-kube-api-access-r7rk7" (OuterVolumeSpecName: "kube-api-access-r7rk7") pod "c98cbede-25b7-40d4-b1ad-18e144e46bcc" (UID: "c98cbede-25b7-40d4-b1ad-18e144e46bcc"). InnerVolumeSpecName "kube-api-access-r7rk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.614308 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98cbede-25b7-40d4-b1ad-18e144e46bcc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c98cbede-25b7-40d4-b1ad-18e144e46bcc" (UID: "c98cbede-25b7-40d4-b1ad-18e144e46bcc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.686638 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-758df45868-55xs8"] Oct 06 08:34:48 crc kubenswrapper[4755]: E1006 08:34:48.686947 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98cbede-25b7-40d4-b1ad-18e144e46bcc" containerName="controller-manager" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.686962 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98cbede-25b7-40d4-b1ad-18e144e46bcc" containerName="controller-manager" Oct 06 08:34:48 crc kubenswrapper[4755]: E1006 08:34:48.686978 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c65a52-4ea4-4b9c-b128-3f11b7bc0227" containerName="route-controller-manager" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.686985 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c65a52-4ea4-4b9c-b128-3f11b7bc0227" 
containerName="route-controller-manager" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.687116 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98cbede-25b7-40d4-b1ad-18e144e46bcc" containerName="controller-manager" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.687134 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c65a52-4ea4-4b9c-b128-3f11b7bc0227" containerName="route-controller-manager" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.687704 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.699044 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr"] Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.703751 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.708609 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-config\") pod \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.708690 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-client-ca\") pod \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.708735 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wm5h\" (UniqueName: 
\"kubernetes.io/projected/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-kube-api-access-6wm5h\") pod \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.708862 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-serving-cert\") pod \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\" (UID: \"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227\") " Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.709155 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c98cbede-25b7-40d4-b1ad-18e144e46bcc-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.709182 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c98cbede-25b7-40d4-b1ad-18e144e46bcc-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.709196 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7rk7\" (UniqueName: \"kubernetes.io/projected/c98cbede-25b7-40d4-b1ad-18e144e46bcc-kube-api-access-r7rk7\") on node \"crc\" DevicePath \"\"" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.709748 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-client-ca" (OuterVolumeSpecName: "client-ca") pod "c6c65a52-4ea4-4b9c-b128-3f11b7bc0227" (UID: "c6c65a52-4ea4-4b9c-b128-3f11b7bc0227"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.710226 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-config" (OuterVolumeSpecName: "config") pod "c6c65a52-4ea4-4b9c-b128-3f11b7bc0227" (UID: "c6c65a52-4ea4-4b9c-b128-3f11b7bc0227"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.710246 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-758df45868-55xs8"] Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.712006 4755 generic.go:334] "Generic (PLEG): container finished" podID="c6c65a52-4ea4-4b9c-b128-3f11b7bc0227" containerID="06415cdcb3b3e07252b4c11041d1a58820b52190461e377045bc992d20f8d6ef" exitCode=0 Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.712095 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" event={"ID":"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227","Type":"ContainerDied","Data":"06415cdcb3b3e07252b4c11041d1a58820b52190461e377045bc992d20f8d6ef"} Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.712168 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" event={"ID":"c6c65a52-4ea4-4b9c-b128-3f11b7bc0227","Type":"ContainerDied","Data":"52c738f322f5c636d9687df7528b7e9355ad72e7265c90dde8163fc60a00f320"} Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.712193 4755 scope.go:117] "RemoveContainer" containerID="06415cdcb3b3e07252b4c11041d1a58820b52190461e377045bc992d20f8d6ef" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.712464 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.715448 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-kube-api-access-6wm5h" (OuterVolumeSpecName: "kube-api-access-6wm5h") pod "c6c65a52-4ea4-4b9c-b128-3f11b7bc0227" (UID: "c6c65a52-4ea4-4b9c-b128-3f11b7bc0227"). InnerVolumeSpecName "kube-api-access-6wm5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.716055 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c6c65a52-4ea4-4b9c-b128-3f11b7bc0227" (UID: "c6c65a52-4ea4-4b9c-b128-3f11b7bc0227"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.721219 4755 generic.go:334] "Generic (PLEG): container finished" podID="c98cbede-25b7-40d4-b1ad-18e144e46bcc" containerID="0c6a770ef2710b787fe22e0451b60021c844573cc17ef56bedaa296edfa8ee15" exitCode=0 Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.721362 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" event={"ID":"c98cbede-25b7-40d4-b1ad-18e144e46bcc","Type":"ContainerDied","Data":"0c6a770ef2710b787fe22e0451b60021c844573cc17ef56bedaa296edfa8ee15"} Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.721445 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" event={"ID":"c98cbede-25b7-40d4-b1ad-18e144e46bcc","Type":"ContainerDied","Data":"9b6f0d48858e6a81e4877e59a03930926121df05d6b8d8c67e1e623bb9cd576d"} Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.721793 4755 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4skj5" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.725987 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr"] Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.736590 4755 scope.go:117] "RemoveContainer" containerID="06415cdcb3b3e07252b4c11041d1a58820b52190461e377045bc992d20f8d6ef" Oct 06 08:34:48 crc kubenswrapper[4755]: E1006 08:34:48.738936 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06415cdcb3b3e07252b4c11041d1a58820b52190461e377045bc992d20f8d6ef\": container with ID starting with 06415cdcb3b3e07252b4c11041d1a58820b52190461e377045bc992d20f8d6ef not found: ID does not exist" containerID="06415cdcb3b3e07252b4c11041d1a58820b52190461e377045bc992d20f8d6ef" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.738980 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06415cdcb3b3e07252b4c11041d1a58820b52190461e377045bc992d20f8d6ef"} err="failed to get container status \"06415cdcb3b3e07252b4c11041d1a58820b52190461e377045bc992d20f8d6ef\": rpc error: code = NotFound desc = could not find container \"06415cdcb3b3e07252b4c11041d1a58820b52190461e377045bc992d20f8d6ef\": container with ID starting with 06415cdcb3b3e07252b4c11041d1a58820b52190461e377045bc992d20f8d6ef not found: ID does not exist" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.739009 4755 scope.go:117] "RemoveContainer" containerID="0c6a770ef2710b787fe22e0451b60021c844573cc17ef56bedaa296edfa8ee15" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.757870 4755 scope.go:117] "RemoveContainer" containerID="0c6a770ef2710b787fe22e0451b60021c844573cc17ef56bedaa296edfa8ee15" Oct 06 08:34:48 crc kubenswrapper[4755]: E1006 
08:34:48.758401 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c6a770ef2710b787fe22e0451b60021c844573cc17ef56bedaa296edfa8ee15\": container with ID starting with 0c6a770ef2710b787fe22e0451b60021c844573cc17ef56bedaa296edfa8ee15 not found: ID does not exist" containerID="0c6a770ef2710b787fe22e0451b60021c844573cc17ef56bedaa296edfa8ee15" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.758456 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c6a770ef2710b787fe22e0451b60021c844573cc17ef56bedaa296edfa8ee15"} err="failed to get container status \"0c6a770ef2710b787fe22e0451b60021c844573cc17ef56bedaa296edfa8ee15\": rpc error: code = NotFound desc = could not find container \"0c6a770ef2710b787fe22e0451b60021c844573cc17ef56bedaa296edfa8ee15\": container with ID starting with 0c6a770ef2710b787fe22e0451b60021c844573cc17ef56bedaa296edfa8ee15 not found: ID does not exist" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.761257 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4skj5"] Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.764707 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4skj5"] Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.810860 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-proxy-ca-bundles\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.811052 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-serving-cert\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.811124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dcb93bd-a775-4aa7-9792-439bdfdf1b20-client-ca\") pod \"route-controller-manager-7fd5488f47-kmvjr\" (UID: \"3dcb93bd-a775-4aa7-9792-439bdfdf1b20\") " pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.811167 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dcb93bd-a775-4aa7-9792-439bdfdf1b20-serving-cert\") pod \"route-controller-manager-7fd5488f47-kmvjr\" (UID: \"3dcb93bd-a775-4aa7-9792-439bdfdf1b20\") " pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.811252 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj6ts\" (UniqueName: \"kubernetes.io/projected/3dcb93bd-a775-4aa7-9792-439bdfdf1b20-kube-api-access-dj6ts\") pod \"route-controller-manager-7fd5488f47-kmvjr\" (UID: \"3dcb93bd-a775-4aa7-9792-439bdfdf1b20\") " pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.811316 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dcb93bd-a775-4aa7-9792-439bdfdf1b20-config\") pod \"route-controller-manager-7fd5488f47-kmvjr\" (UID: 
\"3dcb93bd-a775-4aa7-9792-439bdfdf1b20\") " pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.811341 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tl2d\" (UniqueName: \"kubernetes.io/projected/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-kube-api-access-8tl2d\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.811379 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-config\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.811395 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-client-ca\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.811545 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.811578 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 
08:34:48.811588 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wm5h\" (UniqueName: \"kubernetes.io/projected/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-kube-api-access-6wm5h\") on node \"crc\" DevicePath \"\"" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.811601 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.912663 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dcb93bd-a775-4aa7-9792-439bdfdf1b20-serving-cert\") pod \"route-controller-manager-7fd5488f47-kmvjr\" (UID: \"3dcb93bd-a775-4aa7-9792-439bdfdf1b20\") " pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.912746 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj6ts\" (UniqueName: \"kubernetes.io/projected/3dcb93bd-a775-4aa7-9792-439bdfdf1b20-kube-api-access-dj6ts\") pod \"route-controller-manager-7fd5488f47-kmvjr\" (UID: \"3dcb93bd-a775-4aa7-9792-439bdfdf1b20\") " pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.912783 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tl2d\" (UniqueName: \"kubernetes.io/projected/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-kube-api-access-8tl2d\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.912809 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3dcb93bd-a775-4aa7-9792-439bdfdf1b20-config\") pod \"route-controller-manager-7fd5488f47-kmvjr\" (UID: \"3dcb93bd-a775-4aa7-9792-439bdfdf1b20\") " pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.912843 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-config\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.914156 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dcb93bd-a775-4aa7-9792-439bdfdf1b20-config\") pod \"route-controller-manager-7fd5488f47-kmvjr\" (UID: \"3dcb93bd-a775-4aa7-9792-439bdfdf1b20\") " pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.914247 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-client-ca\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.914463 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-config\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.914954 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-client-ca\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.915103 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-proxy-ca-bundles\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.915203 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-serving-cert\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.915255 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dcb93bd-a775-4aa7-9792-439bdfdf1b20-client-ca\") pod \"route-controller-manager-7fd5488f47-kmvjr\" (UID: \"3dcb93bd-a775-4aa7-9792-439bdfdf1b20\") " pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.916149 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dcb93bd-a775-4aa7-9792-439bdfdf1b20-client-ca\") pod \"route-controller-manager-7fd5488f47-kmvjr\" (UID: \"3dcb93bd-a775-4aa7-9792-439bdfdf1b20\") " 
pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.917309 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-proxy-ca-bundles\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.917479 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dcb93bd-a775-4aa7-9792-439bdfdf1b20-serving-cert\") pod \"route-controller-manager-7fd5488f47-kmvjr\" (UID: \"3dcb93bd-a775-4aa7-9792-439bdfdf1b20\") " pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.922259 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-serving-cert\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.929462 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tl2d\" (UniqueName: \"kubernetes.io/projected/190eb4a8-9f8d-4007-b4ba-da61d3b0f94d-kube-api-access-8tl2d\") pod \"controller-manager-758df45868-55xs8\" (UID: \"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d\") " pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:48 crc kubenswrapper[4755]: I1006 08:34:48.940868 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj6ts\" (UniqueName: 
\"kubernetes.io/projected/3dcb93bd-a775-4aa7-9792-439bdfdf1b20-kube-api-access-dj6ts\") pod \"route-controller-manager-7fd5488f47-kmvjr\" (UID: \"3dcb93bd-a775-4aa7-9792-439bdfdf1b20\") " pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.007507 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.026599 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.058629 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs"] Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.059521 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nnfs"] Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.348703 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-758df45868-55xs8"] Oct 06 08:34:49 crc kubenswrapper[4755]: W1006 08:34:49.370248 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod190eb4a8_9f8d_4007_b4ba_da61d3b0f94d.slice/crio-bea58bc014d2935c166edb1d9d3fa9e779345ff822761c434434ae14ff2f3fae WatchSource:0}: Error finding container bea58bc014d2935c166edb1d9d3fa9e779345ff822761c434434ae14ff2f3fae: Status 404 returned error can't find the container with id bea58bc014d2935c166edb1d9d3fa9e779345ff822761c434434ae14ff2f3fae Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.501449 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr"] Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.729705 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" event={"ID":"3dcb93bd-a775-4aa7-9792-439bdfdf1b20","Type":"ContainerStarted","Data":"db9bc2d8600d4e3027a8c806a373af988391f9955d6b70c4f568d4a2991017e9"} Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.730092 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.730111 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" event={"ID":"3dcb93bd-a775-4aa7-9792-439bdfdf1b20","Type":"ContainerStarted","Data":"b6ca3fac8c3e29201a046ce2aef54cdd7e086c2d21069b9d4752bee308fd075d"} Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.731298 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-758df45868-55xs8" event={"ID":"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d","Type":"ContainerStarted","Data":"397676a59c81f0544c27b6069c36f202b4b7b3f5f0c7c2b47d6cba81f45b195c"} Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.731345 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-758df45868-55xs8" event={"ID":"190eb4a8-9f8d-4007-b4ba-da61d3b0f94d","Type":"ContainerStarted","Data":"bea58bc014d2935c166edb1d9d3fa9e779345ff822761c434434ae14ff2f3fae"} Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.731503 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.731859 4755 patch_prober.go:28] interesting 
pod/route-controller-manager-7fd5488f47-kmvjr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.732164 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" podUID="3dcb93bd-a775-4aa7-9792-439bdfdf1b20" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.735234 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-758df45868-55xs8" Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.749860 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" podStartSLOduration=1.7498412239999999 podStartE2EDuration="1.749841224s" podCreationTimestamp="2025-10-06 08:34:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:34:49.747027347 +0000 UTC m=+746.576342571" watchObservedRunningTime="2025-10-06 08:34:49.749841224 +0000 UTC m=+746.579156438" Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.761228 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-758df45868-55xs8" podStartSLOduration=1.761210385 podStartE2EDuration="1.761210385s" podCreationTimestamp="2025-10-06 08:34:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:34:49.760585529 
+0000 UTC m=+746.589900753" watchObservedRunningTime="2025-10-06 08:34:49.761210385 +0000 UTC m=+746.590525599" Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.885511 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6c65a52-4ea4-4b9c-b128-3f11b7bc0227" path="/var/lib/kubelet/pods/c6c65a52-4ea4-4b9c-b128-3f11b7bc0227/volumes" Oct 06 08:34:49 crc kubenswrapper[4755]: I1006 08:34:49.886059 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98cbede-25b7-40d4-b1ad-18e144e46bcc" path="/var/lib/kubelet/pods/c98cbede-25b7-40d4-b1ad-18e144e46bcc/volumes" Oct 06 08:34:50 crc kubenswrapper[4755]: I1006 08:34:50.335627 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:50 crc kubenswrapper[4755]: I1006 08:34:50.381958 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:50 crc kubenswrapper[4755]: I1006 08:34:50.745628 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fd5488f47-kmvjr" Oct 06 08:34:55 crc kubenswrapper[4755]: I1006 08:34:55.340016 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-gt2qg" Oct 06 08:34:55 crc kubenswrapper[4755]: I1006 08:34:55.953027 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fft4d" Oct 06 08:34:56 crc kubenswrapper[4755]: I1006 08:34:56.768346 4755 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 08:34:56 crc kubenswrapper[4755]: I1006 08:34:56.929953 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-4rh89" Oct 06 08:35:00 crc kubenswrapper[4755]: I1006 08:35:00.059873 4755 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-4f4f9"] Oct 06 08:35:00 crc kubenswrapper[4755]: I1006 08:35:00.060771 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4f4f9" Oct 06 08:35:00 crc kubenswrapper[4755]: I1006 08:35:00.062735 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 06 08:35:00 crc kubenswrapper[4755]: I1006 08:35:00.068204 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4f4f9"] Oct 06 08:35:00 crc kubenswrapper[4755]: I1006 08:35:00.068382 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 06 08:35:00 crc kubenswrapper[4755]: I1006 08:35:00.185434 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp9xl\" (UniqueName: \"kubernetes.io/projected/c98b993e-1611-4061-b6b6-48ecc7190551-kube-api-access-tp9xl\") pod \"openstack-operator-index-4f4f9\" (UID: \"c98b993e-1611-4061-b6b6-48ecc7190551\") " pod="openstack-operators/openstack-operator-index-4f4f9" Oct 06 08:35:00 crc kubenswrapper[4755]: I1006 08:35:00.286488 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp9xl\" (UniqueName: \"kubernetes.io/projected/c98b993e-1611-4061-b6b6-48ecc7190551-kube-api-access-tp9xl\") pod \"openstack-operator-index-4f4f9\" (UID: \"c98b993e-1611-4061-b6b6-48ecc7190551\") " pod="openstack-operators/openstack-operator-index-4f4f9" Oct 06 08:35:00 crc kubenswrapper[4755]: I1006 08:35:00.320478 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp9xl\" (UniqueName: \"kubernetes.io/projected/c98b993e-1611-4061-b6b6-48ecc7190551-kube-api-access-tp9xl\") pod \"openstack-operator-index-4f4f9\" (UID: 
\"c98b993e-1611-4061-b6b6-48ecc7190551\") " pod="openstack-operators/openstack-operator-index-4f4f9" Oct 06 08:35:00 crc kubenswrapper[4755]: I1006 08:35:00.376713 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4f4f9" Oct 06 08:35:00 crc kubenswrapper[4755]: I1006 08:35:00.826409 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4f4f9"] Oct 06 08:35:01 crc kubenswrapper[4755]: I1006 08:35:01.811522 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4f4f9" event={"ID":"c98b993e-1611-4061-b6b6-48ecc7190551","Type":"ContainerStarted","Data":"d39c73c697961eaa551816226138ae19c41f76feca90860ffba621ce104705ac"} Oct 06 08:35:03 crc kubenswrapper[4755]: I1006 08:35:03.446449 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-4f4f9"] Oct 06 08:35:03 crc kubenswrapper[4755]: I1006 08:35:03.824031 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4f4f9" event={"ID":"c98b993e-1611-4061-b6b6-48ecc7190551","Type":"ContainerStarted","Data":"d10964b90bb246ab5d756728d9d96c6d9302625687922b26bd0f197aeeaa4293"} Oct 06 08:35:03 crc kubenswrapper[4755]: I1006 08:35:03.845589 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-4f4f9" podStartSLOduration=1.5842685140000001 podStartE2EDuration="3.845546863s" podCreationTimestamp="2025-10-06 08:35:00 +0000 UTC" firstStartedPulling="2025-10-06 08:35:00.839258963 +0000 UTC m=+757.668574177" lastFinishedPulling="2025-10-06 08:35:03.100537312 +0000 UTC m=+759.929852526" observedRunningTime="2025-10-06 08:35:03.843792372 +0000 UTC m=+760.673107586" watchObservedRunningTime="2025-10-06 08:35:03.845546863 +0000 UTC m=+760.674862077" Oct 06 08:35:04 crc kubenswrapper[4755]: I1006 
08:35:04.047820 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hvmrb"] Oct 06 08:35:04 crc kubenswrapper[4755]: I1006 08:35:04.048911 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hvmrb" Oct 06 08:35:04 crc kubenswrapper[4755]: I1006 08:35:04.050943 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6vs8q" Oct 06 08:35:04 crc kubenswrapper[4755]: I1006 08:35:04.062064 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hvmrb"] Oct 06 08:35:04 crc kubenswrapper[4755]: I1006 08:35:04.135585 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj7xh\" (UniqueName: \"kubernetes.io/projected/8031183e-bb8c-4447-8853-cc9a3b0a771f-kube-api-access-mj7xh\") pod \"openstack-operator-index-hvmrb\" (UID: \"8031183e-bb8c-4447-8853-cc9a3b0a771f\") " pod="openstack-operators/openstack-operator-index-hvmrb" Oct 06 08:35:04 crc kubenswrapper[4755]: I1006 08:35:04.237515 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj7xh\" (UniqueName: \"kubernetes.io/projected/8031183e-bb8c-4447-8853-cc9a3b0a771f-kube-api-access-mj7xh\") pod \"openstack-operator-index-hvmrb\" (UID: \"8031183e-bb8c-4447-8853-cc9a3b0a771f\") " pod="openstack-operators/openstack-operator-index-hvmrb" Oct 06 08:35:04 crc kubenswrapper[4755]: I1006 08:35:04.278737 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj7xh\" (UniqueName: \"kubernetes.io/projected/8031183e-bb8c-4447-8853-cc9a3b0a771f-kube-api-access-mj7xh\") pod \"openstack-operator-index-hvmrb\" (UID: \"8031183e-bb8c-4447-8853-cc9a3b0a771f\") " pod="openstack-operators/openstack-operator-index-hvmrb" Oct 06 08:35:04 crc kubenswrapper[4755]: 
I1006 08:35:04.378523 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hvmrb" Oct 06 08:35:04 crc kubenswrapper[4755]: I1006 08:35:04.785995 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hvmrb"] Oct 06 08:35:04 crc kubenswrapper[4755]: I1006 08:35:04.830812 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hvmrb" event={"ID":"8031183e-bb8c-4447-8853-cc9a3b0a771f","Type":"ContainerStarted","Data":"ad30e8f8bc2c0744e6b33378f891c0c595ae087f8004c4671e017fbf9bff689a"} Oct 06 08:35:04 crc kubenswrapper[4755]: I1006 08:35:04.830919 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-4f4f9" podUID="c98b993e-1611-4061-b6b6-48ecc7190551" containerName="registry-server" containerID="cri-o://d10964b90bb246ab5d756728d9d96c6d9302625687922b26bd0f197aeeaa4293" gracePeriod=2 Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.301956 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4f4f9" Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.456124 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp9xl\" (UniqueName: \"kubernetes.io/projected/c98b993e-1611-4061-b6b6-48ecc7190551-kube-api-access-tp9xl\") pod \"c98b993e-1611-4061-b6b6-48ecc7190551\" (UID: \"c98b993e-1611-4061-b6b6-48ecc7190551\") " Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.470405 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98b993e-1611-4061-b6b6-48ecc7190551-kube-api-access-tp9xl" (OuterVolumeSpecName: "kube-api-access-tp9xl") pod "c98b993e-1611-4061-b6b6-48ecc7190551" (UID: "c98b993e-1611-4061-b6b6-48ecc7190551"). 
InnerVolumeSpecName "kube-api-access-tp9xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.558444 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp9xl\" (UniqueName: \"kubernetes.io/projected/c98b993e-1611-4061-b6b6-48ecc7190551-kube-api-access-tp9xl\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.844460 4755 generic.go:334] "Generic (PLEG): container finished" podID="c98b993e-1611-4061-b6b6-48ecc7190551" containerID="d10964b90bb246ab5d756728d9d96c6d9302625687922b26bd0f197aeeaa4293" exitCode=0 Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.844588 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4f4f9" event={"ID":"c98b993e-1611-4061-b6b6-48ecc7190551","Type":"ContainerDied","Data":"d10964b90bb246ab5d756728d9d96c6d9302625687922b26bd0f197aeeaa4293"} Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.844696 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4f4f9" event={"ID":"c98b993e-1611-4061-b6b6-48ecc7190551","Type":"ContainerDied","Data":"d39c73c697961eaa551816226138ae19c41f76feca90860ffba621ce104705ac"} Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.844613 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4f4f9" Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.844725 4755 scope.go:117] "RemoveContainer" containerID="d10964b90bb246ab5d756728d9d96c6d9302625687922b26bd0f197aeeaa4293" Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.848484 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hvmrb" event={"ID":"8031183e-bb8c-4447-8853-cc9a3b0a771f","Type":"ContainerStarted","Data":"6054e905bca3154c3f536c3a8a094e97b7baca98bba3571b9bc9651aa6a43ae3"} Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.875475 4755 scope.go:117] "RemoveContainer" containerID="d10964b90bb246ab5d756728d9d96c6d9302625687922b26bd0f197aeeaa4293" Oct 06 08:35:05 crc kubenswrapper[4755]: E1006 08:35:05.876394 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d10964b90bb246ab5d756728d9d96c6d9302625687922b26bd0f197aeeaa4293\": container with ID starting with d10964b90bb246ab5d756728d9d96c6d9302625687922b26bd0f197aeeaa4293 not found: ID does not exist" containerID="d10964b90bb246ab5d756728d9d96c6d9302625687922b26bd0f197aeeaa4293" Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.876673 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d10964b90bb246ab5d756728d9d96c6d9302625687922b26bd0f197aeeaa4293"} err="failed to get container status \"d10964b90bb246ab5d756728d9d96c6d9302625687922b26bd0f197aeeaa4293\": rpc error: code = NotFound desc = could not find container \"d10964b90bb246ab5d756728d9d96c6d9302625687922b26bd0f197aeeaa4293\": container with ID starting with d10964b90bb246ab5d756728d9d96c6d9302625687922b26bd0f197aeeaa4293 not found: ID does not exist" Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.896525 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-hvmrb" podStartSLOduration=1.837245912 podStartE2EDuration="1.896494584s" podCreationTimestamp="2025-10-06 08:35:04 +0000 UTC" firstStartedPulling="2025-10-06 08:35:04.799294355 +0000 UTC m=+761.628609569" lastFinishedPulling="2025-10-06 08:35:04.858543027 +0000 UTC m=+761.687858241" observedRunningTime="2025-10-06 08:35:05.893067462 +0000 UTC m=+762.722382696" watchObservedRunningTime="2025-10-06 08:35:05.896494584 +0000 UTC m=+762.725809808" Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.920676 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-4f4f9"] Oct 06 08:35:05 crc kubenswrapper[4755]: I1006 08:35:05.929202 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-4f4f9"] Oct 06 08:35:07 crc kubenswrapper[4755]: I1006 08:35:07.893368 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98b993e-1611-4061-b6b6-48ecc7190551" path="/var/lib/kubelet/pods/c98b993e-1611-4061-b6b6-48ecc7190551/volumes" Oct 06 08:35:14 crc kubenswrapper[4755]: I1006 08:35:14.379042 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-hvmrb" Oct 06 08:35:14 crc kubenswrapper[4755]: I1006 08:35:14.379672 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-hvmrb" Oct 06 08:35:14 crc kubenswrapper[4755]: I1006 08:35:14.418947 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-hvmrb" Oct 06 08:35:14 crc kubenswrapper[4755]: I1006 08:35:14.950584 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-hvmrb" Oct 06 08:35:18 crc kubenswrapper[4755]: I1006 08:35:18.912254 4755 patch_prober.go:28] interesting 
pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:35:18 crc kubenswrapper[4755]: I1006 08:35:18.912845 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:35:20 crc kubenswrapper[4755]: I1006 08:35:20.855885 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545"] Oct 06 08:35:20 crc kubenswrapper[4755]: E1006 08:35:20.856556 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98b993e-1611-4061-b6b6-48ecc7190551" containerName="registry-server" Oct 06 08:35:20 crc kubenswrapper[4755]: I1006 08:35:20.856589 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98b993e-1611-4061-b6b6-48ecc7190551" containerName="registry-server" Oct 06 08:35:20 crc kubenswrapper[4755]: I1006 08:35:20.856915 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98b993e-1611-4061-b6b6-48ecc7190551" containerName="registry-server" Oct 06 08:35:20 crc kubenswrapper[4755]: I1006 08:35:20.858605 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" Oct 06 08:35:20 crc kubenswrapper[4755]: I1006 08:35:20.862289 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-pfk4f" Oct 06 08:35:20 crc kubenswrapper[4755]: I1006 08:35:20.871435 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545"] Oct 06 08:35:21 crc kubenswrapper[4755]: I1006 08:35:21.015427 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7e104a0-7135-4172-b7c1-5edd90949112-bundle\") pod \"075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545\" (UID: \"a7e104a0-7135-4172-b7c1-5edd90949112\") " pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" Oct 06 08:35:21 crc kubenswrapper[4755]: I1006 08:35:21.016179 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78gfp\" (UniqueName: \"kubernetes.io/projected/a7e104a0-7135-4172-b7c1-5edd90949112-kube-api-access-78gfp\") pod \"075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545\" (UID: \"a7e104a0-7135-4172-b7c1-5edd90949112\") " pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" Oct 06 08:35:21 crc kubenswrapper[4755]: I1006 08:35:21.016348 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7e104a0-7135-4172-b7c1-5edd90949112-util\") pod \"075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545\" (UID: \"a7e104a0-7135-4172-b7c1-5edd90949112\") " pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" Oct 06 08:35:21 crc kubenswrapper[4755]: I1006 
08:35:21.118040 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7e104a0-7135-4172-b7c1-5edd90949112-bundle\") pod \"075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545\" (UID: \"a7e104a0-7135-4172-b7c1-5edd90949112\") " pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" Oct 06 08:35:21 crc kubenswrapper[4755]: I1006 08:35:21.118130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78gfp\" (UniqueName: \"kubernetes.io/projected/a7e104a0-7135-4172-b7c1-5edd90949112-kube-api-access-78gfp\") pod \"075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545\" (UID: \"a7e104a0-7135-4172-b7c1-5edd90949112\") " pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" Oct 06 08:35:21 crc kubenswrapper[4755]: I1006 08:35:21.118225 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7e104a0-7135-4172-b7c1-5edd90949112-util\") pod \"075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545\" (UID: \"a7e104a0-7135-4172-b7c1-5edd90949112\") " pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" Oct 06 08:35:21 crc kubenswrapper[4755]: I1006 08:35:21.118750 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7e104a0-7135-4172-b7c1-5edd90949112-bundle\") pod \"075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545\" (UID: \"a7e104a0-7135-4172-b7c1-5edd90949112\") " pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" Oct 06 08:35:21 crc kubenswrapper[4755]: I1006 08:35:21.118903 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a7e104a0-7135-4172-b7c1-5edd90949112-util\") pod \"075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545\" (UID: \"a7e104a0-7135-4172-b7c1-5edd90949112\") " pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" Oct 06 08:35:21 crc kubenswrapper[4755]: I1006 08:35:21.144298 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78gfp\" (UniqueName: \"kubernetes.io/projected/a7e104a0-7135-4172-b7c1-5edd90949112-kube-api-access-78gfp\") pod \"075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545\" (UID: \"a7e104a0-7135-4172-b7c1-5edd90949112\") " pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" Oct 06 08:35:21 crc kubenswrapper[4755]: I1006 08:35:21.189536 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" Oct 06 08:35:21 crc kubenswrapper[4755]: I1006 08:35:21.656202 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545"] Oct 06 08:35:21 crc kubenswrapper[4755]: W1006 08:35:21.667169 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7e104a0_7135_4172_b7c1_5edd90949112.slice/crio-57ba5e5b4f9af9ff0a53f504f7d42cedb2051869960bf3f0f8ea487d2fb498e4 WatchSource:0}: Error finding container 57ba5e5b4f9af9ff0a53f504f7d42cedb2051869960bf3f0f8ea487d2fb498e4: Status 404 returned error can't find the container with id 57ba5e5b4f9af9ff0a53f504f7d42cedb2051869960bf3f0f8ea487d2fb498e4 Oct 06 08:35:21 crc kubenswrapper[4755]: I1006 08:35:21.971607 4755 generic.go:334] "Generic (PLEG): container finished" podID="a7e104a0-7135-4172-b7c1-5edd90949112" containerID="c0b6fbe60b96e9f361e87edc2ff8f0f92fbd2aaf1cdbe9fe7c9c854760a71464" exitCode=0 Oct 06 
08:35:21 crc kubenswrapper[4755]: I1006 08:35:21.971652 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" event={"ID":"a7e104a0-7135-4172-b7c1-5edd90949112","Type":"ContainerDied","Data":"c0b6fbe60b96e9f361e87edc2ff8f0f92fbd2aaf1cdbe9fe7c9c854760a71464"} Oct 06 08:35:21 crc kubenswrapper[4755]: I1006 08:35:21.971678 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" event={"ID":"a7e104a0-7135-4172-b7c1-5edd90949112","Type":"ContainerStarted","Data":"57ba5e5b4f9af9ff0a53f504f7d42cedb2051869960bf3f0f8ea487d2fb498e4"} Oct 06 08:35:22 crc kubenswrapper[4755]: I1006 08:35:22.984663 4755 generic.go:334] "Generic (PLEG): container finished" podID="a7e104a0-7135-4172-b7c1-5edd90949112" containerID="dbcd410821269e7c41b20a4984587b905ec3fc33eec8b959342ec5d465d55c6e" exitCode=0 Oct 06 08:35:22 crc kubenswrapper[4755]: I1006 08:35:22.984815 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" event={"ID":"a7e104a0-7135-4172-b7c1-5edd90949112","Type":"ContainerDied","Data":"dbcd410821269e7c41b20a4984587b905ec3fc33eec8b959342ec5d465d55c6e"} Oct 06 08:35:23 crc kubenswrapper[4755]: I1006 08:35:23.996402 4755 generic.go:334] "Generic (PLEG): container finished" podID="a7e104a0-7135-4172-b7c1-5edd90949112" containerID="4684e30c94ca2df396a7c30314da5120210f2cb8ae501d4577de91c46a465707" exitCode=0 Oct 06 08:35:23 crc kubenswrapper[4755]: I1006 08:35:23.996536 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" event={"ID":"a7e104a0-7135-4172-b7c1-5edd90949112","Type":"ContainerDied","Data":"4684e30c94ca2df396a7c30314da5120210f2cb8ae501d4577de91c46a465707"} Oct 06 08:35:25 crc kubenswrapper[4755]: I1006 08:35:25.346355 
4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" Oct 06 08:35:25 crc kubenswrapper[4755]: I1006 08:35:25.488744 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7e104a0-7135-4172-b7c1-5edd90949112-bundle\") pod \"a7e104a0-7135-4172-b7c1-5edd90949112\" (UID: \"a7e104a0-7135-4172-b7c1-5edd90949112\") " Oct 06 08:35:25 crc kubenswrapper[4755]: I1006 08:35:25.488884 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7e104a0-7135-4172-b7c1-5edd90949112-util\") pod \"a7e104a0-7135-4172-b7c1-5edd90949112\" (UID: \"a7e104a0-7135-4172-b7c1-5edd90949112\") " Oct 06 08:35:25 crc kubenswrapper[4755]: I1006 08:35:25.488927 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78gfp\" (UniqueName: \"kubernetes.io/projected/a7e104a0-7135-4172-b7c1-5edd90949112-kube-api-access-78gfp\") pod \"a7e104a0-7135-4172-b7c1-5edd90949112\" (UID: \"a7e104a0-7135-4172-b7c1-5edd90949112\") " Oct 06 08:35:25 crc kubenswrapper[4755]: I1006 08:35:25.489474 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7e104a0-7135-4172-b7c1-5edd90949112-bundle" (OuterVolumeSpecName: "bundle") pod "a7e104a0-7135-4172-b7c1-5edd90949112" (UID: "a7e104a0-7135-4172-b7c1-5edd90949112"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:35:25 crc kubenswrapper[4755]: I1006 08:35:25.496529 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e104a0-7135-4172-b7c1-5edd90949112-kube-api-access-78gfp" (OuterVolumeSpecName: "kube-api-access-78gfp") pod "a7e104a0-7135-4172-b7c1-5edd90949112" (UID: "a7e104a0-7135-4172-b7c1-5edd90949112"). 
InnerVolumeSpecName "kube-api-access-78gfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:35:25 crc kubenswrapper[4755]: I1006 08:35:25.501923 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7e104a0-7135-4172-b7c1-5edd90949112-util" (OuterVolumeSpecName: "util") pod "a7e104a0-7135-4172-b7c1-5edd90949112" (UID: "a7e104a0-7135-4172-b7c1-5edd90949112"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:35:25 crc kubenswrapper[4755]: I1006 08:35:25.590167 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7e104a0-7135-4172-b7c1-5edd90949112-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:25 crc kubenswrapper[4755]: I1006 08:35:25.590218 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7e104a0-7135-4172-b7c1-5edd90949112-util\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:25 crc kubenswrapper[4755]: I1006 08:35:25.590233 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78gfp\" (UniqueName: \"kubernetes.io/projected/a7e104a0-7135-4172-b7c1-5edd90949112-kube-api-access-78gfp\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:26 crc kubenswrapper[4755]: I1006 08:35:26.012872 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" event={"ID":"a7e104a0-7135-4172-b7c1-5edd90949112","Type":"ContainerDied","Data":"57ba5e5b4f9af9ff0a53f504f7d42cedb2051869960bf3f0f8ea487d2fb498e4"} Oct 06 08:35:26 crc kubenswrapper[4755]: I1006 08:35:26.012908 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545" Oct 06 08:35:26 crc kubenswrapper[4755]: I1006 08:35:26.012914 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57ba5e5b4f9af9ff0a53f504f7d42cedb2051869960bf3f0f8ea487d2fb498e4" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.416003 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gts28"] Oct 06 08:35:29 crc kubenswrapper[4755]: E1006 08:35:29.416541 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e104a0-7135-4172-b7c1-5edd90949112" containerName="pull" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.416555 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e104a0-7135-4172-b7c1-5edd90949112" containerName="pull" Oct 06 08:35:29 crc kubenswrapper[4755]: E1006 08:35:29.416583 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e104a0-7135-4172-b7c1-5edd90949112" containerName="util" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.416633 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e104a0-7135-4172-b7c1-5edd90949112" containerName="util" Oct 06 08:35:29 crc kubenswrapper[4755]: E1006 08:35:29.416652 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e104a0-7135-4172-b7c1-5edd90949112" containerName="extract" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.416661 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e104a0-7135-4172-b7c1-5edd90949112" containerName="extract" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.416791 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e104a0-7135-4172-b7c1-5edd90949112" containerName="extract" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.417672 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.425166 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gts28"] Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.462007 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c15aea-bec5-47a6-ac2c-20af52549a69-catalog-content\") pod \"redhat-marketplace-gts28\" (UID: \"e3c15aea-bec5-47a6-ac2c-20af52549a69\") " pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.462099 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84kfp\" (UniqueName: \"kubernetes.io/projected/e3c15aea-bec5-47a6-ac2c-20af52549a69-kube-api-access-84kfp\") pod \"redhat-marketplace-gts28\" (UID: \"e3c15aea-bec5-47a6-ac2c-20af52549a69\") " pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.462154 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c15aea-bec5-47a6-ac2c-20af52549a69-utilities\") pod \"redhat-marketplace-gts28\" (UID: \"e3c15aea-bec5-47a6-ac2c-20af52549a69\") " pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.563467 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c15aea-bec5-47a6-ac2c-20af52549a69-catalog-content\") pod \"redhat-marketplace-gts28\" (UID: \"e3c15aea-bec5-47a6-ac2c-20af52549a69\") " pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.564011 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-84kfp\" (UniqueName: \"kubernetes.io/projected/e3c15aea-bec5-47a6-ac2c-20af52549a69-kube-api-access-84kfp\") pod \"redhat-marketplace-gts28\" (UID: \"e3c15aea-bec5-47a6-ac2c-20af52549a69\") " pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.564273 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c15aea-bec5-47a6-ac2c-20af52549a69-utilities\") pod \"redhat-marketplace-gts28\" (UID: \"e3c15aea-bec5-47a6-ac2c-20af52549a69\") " pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.564515 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c15aea-bec5-47a6-ac2c-20af52549a69-catalog-content\") pod \"redhat-marketplace-gts28\" (UID: \"e3c15aea-bec5-47a6-ac2c-20af52549a69\") " pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.564819 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c15aea-bec5-47a6-ac2c-20af52549a69-utilities\") pod \"redhat-marketplace-gts28\" (UID: \"e3c15aea-bec5-47a6-ac2c-20af52549a69\") " pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.599921 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84kfp\" (UniqueName: \"kubernetes.io/projected/e3c15aea-bec5-47a6-ac2c-20af52549a69-kube-api-access-84kfp\") pod \"redhat-marketplace-gts28\" (UID: \"e3c15aea-bec5-47a6-ac2c-20af52549a69\") " pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:29 crc kubenswrapper[4755]: I1006 08:35:29.770191 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:30 crc kubenswrapper[4755]: I1006 08:35:30.192783 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gts28"] Oct 06 08:35:31 crc kubenswrapper[4755]: I1006 08:35:31.054076 4755 generic.go:334] "Generic (PLEG): container finished" podID="e3c15aea-bec5-47a6-ac2c-20af52549a69" containerID="45727e270148de98686fa621de62a33dae5553bd97802108b704520cbe5dc96d" exitCode=0 Oct 06 08:35:31 crc kubenswrapper[4755]: I1006 08:35:31.054152 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gts28" event={"ID":"e3c15aea-bec5-47a6-ac2c-20af52549a69","Type":"ContainerDied","Data":"45727e270148de98686fa621de62a33dae5553bd97802108b704520cbe5dc96d"} Oct 06 08:35:31 crc kubenswrapper[4755]: I1006 08:35:31.054448 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gts28" event={"ID":"e3c15aea-bec5-47a6-ac2c-20af52549a69","Type":"ContainerStarted","Data":"acb0c9d9e239e053b9ba8ff2ae4f3c39fa0df410aee62f46cc2228717799d738"} Oct 06 08:35:32 crc kubenswrapper[4755]: I1006 08:35:32.079652 4755 generic.go:334] "Generic (PLEG): container finished" podID="e3c15aea-bec5-47a6-ac2c-20af52549a69" containerID="0328484b87411d47be0e8e83f97044b7b1013fd48b464c8fdefca7231b46757f" exitCode=0 Oct 06 08:35:32 crc kubenswrapper[4755]: I1006 08:35:32.080103 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gts28" event={"ID":"e3c15aea-bec5-47a6-ac2c-20af52549a69","Type":"ContainerDied","Data":"0328484b87411d47be0e8e83f97044b7b1013fd48b464c8fdefca7231b46757f"} Oct 06 08:35:32 crc kubenswrapper[4755]: I1006 08:35:32.746073 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2"] Oct 06 08:35:32 crc kubenswrapper[4755]: I1006 08:35:32.747259 4755 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2" Oct 06 08:35:32 crc kubenswrapper[4755]: I1006 08:35:32.749137 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-qxznf" Oct 06 08:35:32 crc kubenswrapper[4755]: I1006 08:35:32.768304 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2"] Oct 06 08:35:32 crc kubenswrapper[4755]: I1006 08:35:32.817355 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhdrf\" (UniqueName: \"kubernetes.io/projected/a0cbb4ec-8e43-4a88-a2b6-b516c6017546-kube-api-access-xhdrf\") pod \"openstack-operator-controller-operator-74fdc89789-h88l2\" (UID: \"a0cbb4ec-8e43-4a88-a2b6-b516c6017546\") " pod="openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2" Oct 06 08:35:32 crc kubenswrapper[4755]: I1006 08:35:32.918526 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhdrf\" (UniqueName: \"kubernetes.io/projected/a0cbb4ec-8e43-4a88-a2b6-b516c6017546-kube-api-access-xhdrf\") pod \"openstack-operator-controller-operator-74fdc89789-h88l2\" (UID: \"a0cbb4ec-8e43-4a88-a2b6-b516c6017546\") " pod="openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2" Oct 06 08:35:32 crc kubenswrapper[4755]: I1006 08:35:32.940171 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhdrf\" (UniqueName: \"kubernetes.io/projected/a0cbb4ec-8e43-4a88-a2b6-b516c6017546-kube-api-access-xhdrf\") pod \"openstack-operator-controller-operator-74fdc89789-h88l2\" (UID: \"a0cbb4ec-8e43-4a88-a2b6-b516c6017546\") " pod="openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2" Oct 06 08:35:33 crc 
kubenswrapper[4755]: I1006 08:35:33.062147 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2" Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.103650 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gts28" event={"ID":"e3c15aea-bec5-47a6-ac2c-20af52549a69","Type":"ContainerStarted","Data":"679f0654ef0d7e698f9c925fc1357a94df9cc49b3d56ca535a1cc916a6fce1d5"} Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.135500 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gts28" podStartSLOduration=2.6143387799999998 podStartE2EDuration="4.135477733s" podCreationTimestamp="2025-10-06 08:35:29 +0000 UTC" firstStartedPulling="2025-10-06 08:35:31.056096131 +0000 UTC m=+787.885411345" lastFinishedPulling="2025-10-06 08:35:32.577235064 +0000 UTC m=+789.406550298" observedRunningTime="2025-10-06 08:35:33.131041755 +0000 UTC m=+789.960356979" watchObservedRunningTime="2025-10-06 08:35:33.135477733 +0000 UTC m=+789.964792937" Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.349210 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2"] Oct 06 08:35:33 crc kubenswrapper[4755]: W1006 08:35:33.352341 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0cbb4ec_8e43_4a88_a2b6_b516c6017546.slice/crio-fbf4ab3cd7f928982606ba8c1cc38d937613044ce904c12a0deff3ff35a00cc9 WatchSource:0}: Error finding container fbf4ab3cd7f928982606ba8c1cc38d937613044ce904c12a0deff3ff35a00cc9: Status 404 returned error can't find the container with id fbf4ab3cd7f928982606ba8c1cc38d937613044ce904c12a0deff3ff35a00cc9 Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.380682 4755 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/community-operators-mx66f"] Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.382789 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.392841 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mx66f"] Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.424565 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnq8z\" (UniqueName: \"kubernetes.io/projected/3b21f87b-1784-4f36-ac14-ef94f7c19755-kube-api-access-xnq8z\") pod \"community-operators-mx66f\" (UID: \"3b21f87b-1784-4f36-ac14-ef94f7c19755\") " pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.424652 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b21f87b-1784-4f36-ac14-ef94f7c19755-catalog-content\") pod \"community-operators-mx66f\" (UID: \"3b21f87b-1784-4f36-ac14-ef94f7c19755\") " pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.424719 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b21f87b-1784-4f36-ac14-ef94f7c19755-utilities\") pod \"community-operators-mx66f\" (UID: \"3b21f87b-1784-4f36-ac14-ef94f7c19755\") " pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.526199 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b21f87b-1784-4f36-ac14-ef94f7c19755-utilities\") pod \"community-operators-mx66f\" (UID: 
\"3b21f87b-1784-4f36-ac14-ef94f7c19755\") " pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.526269 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnq8z\" (UniqueName: \"kubernetes.io/projected/3b21f87b-1784-4f36-ac14-ef94f7c19755-kube-api-access-xnq8z\") pod \"community-operators-mx66f\" (UID: \"3b21f87b-1784-4f36-ac14-ef94f7c19755\") " pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.526307 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b21f87b-1784-4f36-ac14-ef94f7c19755-catalog-content\") pod \"community-operators-mx66f\" (UID: \"3b21f87b-1784-4f36-ac14-ef94f7c19755\") " pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.527027 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b21f87b-1784-4f36-ac14-ef94f7c19755-catalog-content\") pod \"community-operators-mx66f\" (UID: \"3b21f87b-1784-4f36-ac14-ef94f7c19755\") " pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.527807 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b21f87b-1784-4f36-ac14-ef94f7c19755-utilities\") pod \"community-operators-mx66f\" (UID: \"3b21f87b-1784-4f36-ac14-ef94f7c19755\") " pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.545584 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnq8z\" (UniqueName: \"kubernetes.io/projected/3b21f87b-1784-4f36-ac14-ef94f7c19755-kube-api-access-xnq8z\") pod \"community-operators-mx66f\" (UID: 
\"3b21f87b-1784-4f36-ac14-ef94f7c19755\") " pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:33 crc kubenswrapper[4755]: I1006 08:35:33.711755 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:34 crc kubenswrapper[4755]: I1006 08:35:34.111144 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2" event={"ID":"a0cbb4ec-8e43-4a88-a2b6-b516c6017546","Type":"ContainerStarted","Data":"fbf4ab3cd7f928982606ba8c1cc38d937613044ce904c12a0deff3ff35a00cc9"} Oct 06 08:35:34 crc kubenswrapper[4755]: I1006 08:35:34.226219 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mx66f"] Oct 06 08:35:35 crc kubenswrapper[4755]: I1006 08:35:35.129293 4755 generic.go:334] "Generic (PLEG): container finished" podID="3b21f87b-1784-4f36-ac14-ef94f7c19755" containerID="0cc5b31251a5dc29d740035ca7d95e60b39903587a9208d323b6f0dbd2da3c64" exitCode=0 Oct 06 08:35:35 crc kubenswrapper[4755]: I1006 08:35:35.129636 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx66f" event={"ID":"3b21f87b-1784-4f36-ac14-ef94f7c19755","Type":"ContainerDied","Data":"0cc5b31251a5dc29d740035ca7d95e60b39903587a9208d323b6f0dbd2da3c64"} Oct 06 08:35:35 crc kubenswrapper[4755]: I1006 08:35:35.129665 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx66f" event={"ID":"3b21f87b-1784-4f36-ac14-ef94f7c19755","Type":"ContainerStarted","Data":"540c2aa4cf6e6f39e3e943e657cb18542f178ec4dcfeea373b5026671eb3b2db"} Oct 06 08:35:36 crc kubenswrapper[4755]: I1006 08:35:36.971926 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qjw25"] Oct 06 08:35:36 crc kubenswrapper[4755]: I1006 08:35:36.974131 4755 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:36 crc kubenswrapper[4755]: I1006 08:35:36.983999 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjw25"] Oct 06 08:35:37 crc kubenswrapper[4755]: I1006 08:35:37.028381 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgsnj\" (UniqueName: \"kubernetes.io/projected/058fad58-c499-43be-816c-4ad22d92f35f-kube-api-access-fgsnj\") pod \"certified-operators-qjw25\" (UID: \"058fad58-c499-43be-816c-4ad22d92f35f\") " pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:37 crc kubenswrapper[4755]: I1006 08:35:37.028438 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/058fad58-c499-43be-816c-4ad22d92f35f-utilities\") pod \"certified-operators-qjw25\" (UID: \"058fad58-c499-43be-816c-4ad22d92f35f\") " pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:37 crc kubenswrapper[4755]: I1006 08:35:37.028466 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/058fad58-c499-43be-816c-4ad22d92f35f-catalog-content\") pod \"certified-operators-qjw25\" (UID: \"058fad58-c499-43be-816c-4ad22d92f35f\") " pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:37 crc kubenswrapper[4755]: I1006 08:35:37.130394 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgsnj\" (UniqueName: \"kubernetes.io/projected/058fad58-c499-43be-816c-4ad22d92f35f-kube-api-access-fgsnj\") pod \"certified-operators-qjw25\" (UID: \"058fad58-c499-43be-816c-4ad22d92f35f\") " pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:37 crc kubenswrapper[4755]: I1006 08:35:37.130459 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/058fad58-c499-43be-816c-4ad22d92f35f-utilities\") pod \"certified-operators-qjw25\" (UID: \"058fad58-c499-43be-816c-4ad22d92f35f\") " pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:37 crc kubenswrapper[4755]: I1006 08:35:37.130488 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/058fad58-c499-43be-816c-4ad22d92f35f-catalog-content\") pod \"certified-operators-qjw25\" (UID: \"058fad58-c499-43be-816c-4ad22d92f35f\") " pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:37 crc kubenswrapper[4755]: I1006 08:35:37.131112 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/058fad58-c499-43be-816c-4ad22d92f35f-catalog-content\") pod \"certified-operators-qjw25\" (UID: \"058fad58-c499-43be-816c-4ad22d92f35f\") " pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:37 crc kubenswrapper[4755]: I1006 08:35:37.131872 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/058fad58-c499-43be-816c-4ad22d92f35f-utilities\") pod \"certified-operators-qjw25\" (UID: \"058fad58-c499-43be-816c-4ad22d92f35f\") " pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:37 crc kubenswrapper[4755]: I1006 08:35:37.158394 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgsnj\" (UniqueName: \"kubernetes.io/projected/058fad58-c499-43be-816c-4ad22d92f35f-kube-api-access-fgsnj\") pod \"certified-operators-qjw25\" (UID: \"058fad58-c499-43be-816c-4ad22d92f35f\") " pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:37 crc kubenswrapper[4755]: I1006 08:35:37.355842 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:38 crc kubenswrapper[4755]: I1006 08:35:38.019464 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjw25"] Oct 06 08:35:38 crc kubenswrapper[4755]: I1006 08:35:38.147455 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2" event={"ID":"a0cbb4ec-8e43-4a88-a2b6-b516c6017546","Type":"ContainerStarted","Data":"dfd7328f8ec3849a8ebf9bdfefb728723cd8f34e26389e582e760e9be12f19ec"} Oct 06 08:35:38 crc kubenswrapper[4755]: I1006 08:35:38.148506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjw25" event={"ID":"058fad58-c499-43be-816c-4ad22d92f35f","Type":"ContainerStarted","Data":"65defc6e06f2fc0b956b406eff5bd32883af6ee0d46cf736dbf5c35098b3f04b"} Oct 06 08:35:38 crc kubenswrapper[4755]: I1006 08:35:38.152307 4755 generic.go:334] "Generic (PLEG): container finished" podID="3b21f87b-1784-4f36-ac14-ef94f7c19755" containerID="1e7aa923622016e1e2d5ef7ab282580423bf490e154515b2163428d1e4704c88" exitCode=0 Oct 06 08:35:38 crc kubenswrapper[4755]: I1006 08:35:38.152364 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx66f" event={"ID":"3b21f87b-1784-4f36-ac14-ef94f7c19755","Type":"ContainerDied","Data":"1e7aa923622016e1e2d5ef7ab282580423bf490e154515b2163428d1e4704c88"} Oct 06 08:35:39 crc kubenswrapper[4755]: I1006 08:35:39.163937 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx66f" event={"ID":"3b21f87b-1784-4f36-ac14-ef94f7c19755","Type":"ContainerStarted","Data":"0f9d2772802a086e26dc8e3c5ec0cf9522acced703843c544c6ed45847e80c5e"} Oct 06 08:35:39 crc kubenswrapper[4755]: I1006 08:35:39.171317 4755 generic.go:334] "Generic (PLEG): container finished" podID="058fad58-c499-43be-816c-4ad22d92f35f" 
containerID="76a67a55d99374baf13000d2d9a7f5b57f399435cf35ee612bf293a22d71a155" exitCode=0 Oct 06 08:35:39 crc kubenswrapper[4755]: I1006 08:35:39.171386 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjw25" event={"ID":"058fad58-c499-43be-816c-4ad22d92f35f","Type":"ContainerDied","Data":"76a67a55d99374baf13000d2d9a7f5b57f399435cf35ee612bf293a22d71a155"} Oct 06 08:35:39 crc kubenswrapper[4755]: I1006 08:35:39.195872 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mx66f" podStartSLOduration=3.658811739 podStartE2EDuration="6.195846684s" podCreationTimestamp="2025-10-06 08:35:33 +0000 UTC" firstStartedPulling="2025-10-06 08:35:36.036788528 +0000 UTC m=+792.866103742" lastFinishedPulling="2025-10-06 08:35:38.573823473 +0000 UTC m=+795.403138687" observedRunningTime="2025-10-06 08:35:39.185078322 +0000 UTC m=+796.014393556" watchObservedRunningTime="2025-10-06 08:35:39.195846684 +0000 UTC m=+796.025161888" Oct 06 08:35:39 crc kubenswrapper[4755]: I1006 08:35:39.771230 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:39 crc kubenswrapper[4755]: I1006 08:35:39.771290 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:39 crc kubenswrapper[4755]: I1006 08:35:39.822111 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:40 crc kubenswrapper[4755]: I1006 08:35:40.240149 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:41 crc kubenswrapper[4755]: I1006 08:35:41.197318 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2" 
event={"ID":"a0cbb4ec-8e43-4a88-a2b6-b516c6017546","Type":"ContainerStarted","Data":"6cb54f550bb3d20f1b269a3416ecae7a2a8bce7456062d06fdf52a720af4f82c"} Oct 06 08:35:41 crc kubenswrapper[4755]: I1006 08:35:41.197720 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2" Oct 06 08:35:41 crc kubenswrapper[4755]: I1006 08:35:41.199984 4755 generic.go:334] "Generic (PLEG): container finished" podID="058fad58-c499-43be-816c-4ad22d92f35f" containerID="134a40303d745633857d9cc2693319d4e838433cd88691b8c50f45bccfa9a844" exitCode=0 Oct 06 08:35:41 crc kubenswrapper[4755]: I1006 08:35:41.200127 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjw25" event={"ID":"058fad58-c499-43be-816c-4ad22d92f35f","Type":"ContainerDied","Data":"134a40303d745633857d9cc2693319d4e838433cd88691b8c50f45bccfa9a844"} Oct 06 08:35:41 crc kubenswrapper[4755]: I1006 08:35:41.240364 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2" podStartSLOduration=2.470394045 podStartE2EDuration="9.240347537s" podCreationTimestamp="2025-10-06 08:35:32 +0000 UTC" firstStartedPulling="2025-10-06 08:35:33.356489519 +0000 UTC m=+790.185804733" lastFinishedPulling="2025-10-06 08:35:40.126443011 +0000 UTC m=+796.955758225" observedRunningTime="2025-10-06 08:35:41.235797127 +0000 UTC m=+798.065112351" watchObservedRunningTime="2025-10-06 08:35:41.240347537 +0000 UTC m=+798.069662751" Oct 06 08:35:42 crc kubenswrapper[4755]: I1006 08:35:42.209766 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjw25" event={"ID":"058fad58-c499-43be-816c-4ad22d92f35f","Type":"ContainerStarted","Data":"2ff4f60cc0f98a99f2da203e84f7005c4b5bd0d1a984bfffde2e860bc5891010"} Oct 06 08:35:42 crc kubenswrapper[4755]: I1006 08:35:42.211010 4755 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-74fdc89789-h88l2" Oct 06 08:35:42 crc kubenswrapper[4755]: I1006 08:35:42.226964 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qjw25" podStartSLOduration=3.813224682 podStartE2EDuration="6.226944697s" podCreationTimestamp="2025-10-06 08:35:36 +0000 UTC" firstStartedPulling="2025-10-06 08:35:39.212982951 +0000 UTC m=+796.042298165" lastFinishedPulling="2025-10-06 08:35:41.626702966 +0000 UTC m=+798.456018180" observedRunningTime="2025-10-06 08:35:42.226389643 +0000 UTC m=+799.055704897" watchObservedRunningTime="2025-10-06 08:35:42.226944697 +0000 UTC m=+799.056259931" Oct 06 08:35:43 crc kubenswrapper[4755]: I1006 08:35:43.562756 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gts28"] Oct 06 08:35:43 crc kubenswrapper[4755]: I1006 08:35:43.563261 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gts28" podUID="e3c15aea-bec5-47a6-ac2c-20af52549a69" containerName="registry-server" containerID="cri-o://679f0654ef0d7e698f9c925fc1357a94df9cc49b3d56ca535a1cc916a6fce1d5" gracePeriod=2 Oct 06 08:35:43 crc kubenswrapper[4755]: I1006 08:35:43.712622 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:43 crc kubenswrapper[4755]: I1006 08:35:43.712714 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:43 crc kubenswrapper[4755]: I1006 08:35:43.765471 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:43 crc kubenswrapper[4755]: I1006 08:35:43.982426 4755 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.049400 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84kfp\" (UniqueName: \"kubernetes.io/projected/e3c15aea-bec5-47a6-ac2c-20af52549a69-kube-api-access-84kfp\") pod \"e3c15aea-bec5-47a6-ac2c-20af52549a69\" (UID: \"e3c15aea-bec5-47a6-ac2c-20af52549a69\") " Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.049471 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c15aea-bec5-47a6-ac2c-20af52549a69-utilities\") pod \"e3c15aea-bec5-47a6-ac2c-20af52549a69\" (UID: \"e3c15aea-bec5-47a6-ac2c-20af52549a69\") " Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.049499 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c15aea-bec5-47a6-ac2c-20af52549a69-catalog-content\") pod \"e3c15aea-bec5-47a6-ac2c-20af52549a69\" (UID: \"e3c15aea-bec5-47a6-ac2c-20af52549a69\") " Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.050979 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c15aea-bec5-47a6-ac2c-20af52549a69-utilities" (OuterVolumeSpecName: "utilities") pod "e3c15aea-bec5-47a6-ac2c-20af52549a69" (UID: "e3c15aea-bec5-47a6-ac2c-20af52549a69"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.055955 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c15aea-bec5-47a6-ac2c-20af52549a69-kube-api-access-84kfp" (OuterVolumeSpecName: "kube-api-access-84kfp") pod "e3c15aea-bec5-47a6-ac2c-20af52549a69" (UID: "e3c15aea-bec5-47a6-ac2c-20af52549a69"). InnerVolumeSpecName "kube-api-access-84kfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.062957 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c15aea-bec5-47a6-ac2c-20af52549a69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3c15aea-bec5-47a6-ac2c-20af52549a69" (UID: "e3c15aea-bec5-47a6-ac2c-20af52549a69"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.151800 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84kfp\" (UniqueName: \"kubernetes.io/projected/e3c15aea-bec5-47a6-ac2c-20af52549a69-kube-api-access-84kfp\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.151833 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c15aea-bec5-47a6-ac2c-20af52549a69-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.151844 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c15aea-bec5-47a6-ac2c-20af52549a69-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.222631 4755 generic.go:334] "Generic (PLEG): container finished" podID="e3c15aea-bec5-47a6-ac2c-20af52549a69" containerID="679f0654ef0d7e698f9c925fc1357a94df9cc49b3d56ca535a1cc916a6fce1d5" exitCode=0 Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.222765 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gts28" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.222841 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gts28" event={"ID":"e3c15aea-bec5-47a6-ac2c-20af52549a69","Type":"ContainerDied","Data":"679f0654ef0d7e698f9c925fc1357a94df9cc49b3d56ca535a1cc916a6fce1d5"} Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.222909 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gts28" event={"ID":"e3c15aea-bec5-47a6-ac2c-20af52549a69","Type":"ContainerDied","Data":"acb0c9d9e239e053b9ba8ff2ae4f3c39fa0df410aee62f46cc2228717799d738"} Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.222941 4755 scope.go:117] "RemoveContainer" containerID="679f0654ef0d7e698f9c925fc1357a94df9cc49b3d56ca535a1cc916a6fce1d5" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.238195 4755 scope.go:117] "RemoveContainer" containerID="0328484b87411d47be0e8e83f97044b7b1013fd48b464c8fdefca7231b46757f" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.261123 4755 scope.go:117] "RemoveContainer" containerID="45727e270148de98686fa621de62a33dae5553bd97802108b704520cbe5dc96d" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.261741 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gts28"] Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.264869 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gts28"] Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.281561 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.298834 4755 scope.go:117] "RemoveContainer" containerID="679f0654ef0d7e698f9c925fc1357a94df9cc49b3d56ca535a1cc916a6fce1d5" Oct 06 08:35:44 crc 
kubenswrapper[4755]: E1006 08:35:44.299689 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"679f0654ef0d7e698f9c925fc1357a94df9cc49b3d56ca535a1cc916a6fce1d5\": container with ID starting with 679f0654ef0d7e698f9c925fc1357a94df9cc49b3d56ca535a1cc916a6fce1d5 not found: ID does not exist" containerID="679f0654ef0d7e698f9c925fc1357a94df9cc49b3d56ca535a1cc916a6fce1d5" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.299849 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"679f0654ef0d7e698f9c925fc1357a94df9cc49b3d56ca535a1cc916a6fce1d5"} err="failed to get container status \"679f0654ef0d7e698f9c925fc1357a94df9cc49b3d56ca535a1cc916a6fce1d5\": rpc error: code = NotFound desc = could not find container \"679f0654ef0d7e698f9c925fc1357a94df9cc49b3d56ca535a1cc916a6fce1d5\": container with ID starting with 679f0654ef0d7e698f9c925fc1357a94df9cc49b3d56ca535a1cc916a6fce1d5 not found: ID does not exist" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.299888 4755 scope.go:117] "RemoveContainer" containerID="0328484b87411d47be0e8e83f97044b7b1013fd48b464c8fdefca7231b46757f" Oct 06 08:35:44 crc kubenswrapper[4755]: E1006 08:35:44.300470 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0328484b87411d47be0e8e83f97044b7b1013fd48b464c8fdefca7231b46757f\": container with ID starting with 0328484b87411d47be0e8e83f97044b7b1013fd48b464c8fdefca7231b46757f not found: ID does not exist" containerID="0328484b87411d47be0e8e83f97044b7b1013fd48b464c8fdefca7231b46757f" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.300578 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0328484b87411d47be0e8e83f97044b7b1013fd48b464c8fdefca7231b46757f"} err="failed to get container status 
\"0328484b87411d47be0e8e83f97044b7b1013fd48b464c8fdefca7231b46757f\": rpc error: code = NotFound desc = could not find container \"0328484b87411d47be0e8e83f97044b7b1013fd48b464c8fdefca7231b46757f\": container with ID starting with 0328484b87411d47be0e8e83f97044b7b1013fd48b464c8fdefca7231b46757f not found: ID does not exist" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.300721 4755 scope.go:117] "RemoveContainer" containerID="45727e270148de98686fa621de62a33dae5553bd97802108b704520cbe5dc96d" Oct 06 08:35:44 crc kubenswrapper[4755]: E1006 08:35:44.301581 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45727e270148de98686fa621de62a33dae5553bd97802108b704520cbe5dc96d\": container with ID starting with 45727e270148de98686fa621de62a33dae5553bd97802108b704520cbe5dc96d not found: ID does not exist" containerID="45727e270148de98686fa621de62a33dae5553bd97802108b704520cbe5dc96d" Oct 06 08:35:44 crc kubenswrapper[4755]: I1006 08:35:44.301612 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45727e270148de98686fa621de62a33dae5553bd97802108b704520cbe5dc96d"} err="failed to get container status \"45727e270148de98686fa621de62a33dae5553bd97802108b704520cbe5dc96d\": rpc error: code = NotFound desc = could not find container \"45727e270148de98686fa621de62a33dae5553bd97802108b704520cbe5dc96d\": container with ID starting with 45727e270148de98686fa621de62a33dae5553bd97802108b704520cbe5dc96d not found: ID does not exist" Oct 06 08:35:45 crc kubenswrapper[4755]: I1006 08:35:45.892675 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c15aea-bec5-47a6-ac2c-20af52549a69" path="/var/lib/kubelet/pods/e3c15aea-bec5-47a6-ac2c-20af52549a69/volumes" Oct 06 08:35:46 crc kubenswrapper[4755]: I1006 08:35:46.163402 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mx66f"] Oct 06 
08:35:46 crc kubenswrapper[4755]: I1006 08:35:46.237056 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mx66f" podUID="3b21f87b-1784-4f36-ac14-ef94f7c19755" containerName="registry-server" containerID="cri-o://0f9d2772802a086e26dc8e3c5ec0cf9522acced703843c544c6ed45847e80c5e" gracePeriod=2 Oct 06 08:35:46 crc kubenswrapper[4755]: I1006 08:35:46.643353 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:46 crc kubenswrapper[4755]: I1006 08:35:46.784477 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b21f87b-1784-4f36-ac14-ef94f7c19755-catalog-content\") pod \"3b21f87b-1784-4f36-ac14-ef94f7c19755\" (UID: \"3b21f87b-1784-4f36-ac14-ef94f7c19755\") " Oct 06 08:35:46 crc kubenswrapper[4755]: I1006 08:35:46.784533 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnq8z\" (UniqueName: \"kubernetes.io/projected/3b21f87b-1784-4f36-ac14-ef94f7c19755-kube-api-access-xnq8z\") pod \"3b21f87b-1784-4f36-ac14-ef94f7c19755\" (UID: \"3b21f87b-1784-4f36-ac14-ef94f7c19755\") " Oct 06 08:35:46 crc kubenswrapper[4755]: I1006 08:35:46.784649 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b21f87b-1784-4f36-ac14-ef94f7c19755-utilities\") pod \"3b21f87b-1784-4f36-ac14-ef94f7c19755\" (UID: \"3b21f87b-1784-4f36-ac14-ef94f7c19755\") " Oct 06 08:35:46 crc kubenswrapper[4755]: I1006 08:35:46.785556 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b21f87b-1784-4f36-ac14-ef94f7c19755-utilities" (OuterVolumeSpecName: "utilities") pod "3b21f87b-1784-4f36-ac14-ef94f7c19755" (UID: "3b21f87b-1784-4f36-ac14-ef94f7c19755"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:35:46 crc kubenswrapper[4755]: I1006 08:35:46.789169 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b21f87b-1784-4f36-ac14-ef94f7c19755-kube-api-access-xnq8z" (OuterVolumeSpecName: "kube-api-access-xnq8z") pod "3b21f87b-1784-4f36-ac14-ef94f7c19755" (UID: "3b21f87b-1784-4f36-ac14-ef94f7c19755"). InnerVolumeSpecName "kube-api-access-xnq8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:35:46 crc kubenswrapper[4755]: I1006 08:35:46.851393 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b21f87b-1784-4f36-ac14-ef94f7c19755-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b21f87b-1784-4f36-ac14-ef94f7c19755" (UID: "3b21f87b-1784-4f36-ac14-ef94f7c19755"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:35:46 crc kubenswrapper[4755]: I1006 08:35:46.886180 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b21f87b-1784-4f36-ac14-ef94f7c19755-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:46 crc kubenswrapper[4755]: I1006 08:35:46.886223 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b21f87b-1784-4f36-ac14-ef94f7c19755-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:46 crc kubenswrapper[4755]: I1006 08:35:46.886237 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnq8z\" (UniqueName: \"kubernetes.io/projected/3b21f87b-1784-4f36-ac14-ef94f7c19755-kube-api-access-xnq8z\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.244396 4755 generic.go:334] "Generic (PLEG): container finished" podID="3b21f87b-1784-4f36-ac14-ef94f7c19755" 
containerID="0f9d2772802a086e26dc8e3c5ec0cf9522acced703843c544c6ed45847e80c5e" exitCode=0 Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.244437 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx66f" event={"ID":"3b21f87b-1784-4f36-ac14-ef94f7c19755","Type":"ContainerDied","Data":"0f9d2772802a086e26dc8e3c5ec0cf9522acced703843c544c6ed45847e80c5e"} Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.244451 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mx66f" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.244463 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx66f" event={"ID":"3b21f87b-1784-4f36-ac14-ef94f7c19755","Type":"ContainerDied","Data":"540c2aa4cf6e6f39e3e943e657cb18542f178ec4dcfeea373b5026671eb3b2db"} Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.244482 4755 scope.go:117] "RemoveContainer" containerID="0f9d2772802a086e26dc8e3c5ec0cf9522acced703843c544c6ed45847e80c5e" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.258537 4755 scope.go:117] "RemoveContainer" containerID="1e7aa923622016e1e2d5ef7ab282580423bf490e154515b2163428d1e4704c88" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.280068 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mx66f"] Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.283969 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mx66f"] Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.298961 4755 scope.go:117] "RemoveContainer" containerID="0cc5b31251a5dc29d740035ca7d95e60b39903587a9208d323b6f0dbd2da3c64" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.312402 4755 scope.go:117] "RemoveContainer" containerID="0f9d2772802a086e26dc8e3c5ec0cf9522acced703843c544c6ed45847e80c5e" Oct 06 
08:35:47 crc kubenswrapper[4755]: E1006 08:35:47.312832 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f9d2772802a086e26dc8e3c5ec0cf9522acced703843c544c6ed45847e80c5e\": container with ID starting with 0f9d2772802a086e26dc8e3c5ec0cf9522acced703843c544c6ed45847e80c5e not found: ID does not exist" containerID="0f9d2772802a086e26dc8e3c5ec0cf9522acced703843c544c6ed45847e80c5e" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.312868 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9d2772802a086e26dc8e3c5ec0cf9522acced703843c544c6ed45847e80c5e"} err="failed to get container status \"0f9d2772802a086e26dc8e3c5ec0cf9522acced703843c544c6ed45847e80c5e\": rpc error: code = NotFound desc = could not find container \"0f9d2772802a086e26dc8e3c5ec0cf9522acced703843c544c6ed45847e80c5e\": container with ID starting with 0f9d2772802a086e26dc8e3c5ec0cf9522acced703843c544c6ed45847e80c5e not found: ID does not exist" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.312892 4755 scope.go:117] "RemoveContainer" containerID="1e7aa923622016e1e2d5ef7ab282580423bf490e154515b2163428d1e4704c88" Oct 06 08:35:47 crc kubenswrapper[4755]: E1006 08:35:47.313181 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e7aa923622016e1e2d5ef7ab282580423bf490e154515b2163428d1e4704c88\": container with ID starting with 1e7aa923622016e1e2d5ef7ab282580423bf490e154515b2163428d1e4704c88 not found: ID does not exist" containerID="1e7aa923622016e1e2d5ef7ab282580423bf490e154515b2163428d1e4704c88" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.313207 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e7aa923622016e1e2d5ef7ab282580423bf490e154515b2163428d1e4704c88"} err="failed to get container status 
\"1e7aa923622016e1e2d5ef7ab282580423bf490e154515b2163428d1e4704c88\": rpc error: code = NotFound desc = could not find container \"1e7aa923622016e1e2d5ef7ab282580423bf490e154515b2163428d1e4704c88\": container with ID starting with 1e7aa923622016e1e2d5ef7ab282580423bf490e154515b2163428d1e4704c88 not found: ID does not exist" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.313225 4755 scope.go:117] "RemoveContainer" containerID="0cc5b31251a5dc29d740035ca7d95e60b39903587a9208d323b6f0dbd2da3c64" Oct 06 08:35:47 crc kubenswrapper[4755]: E1006 08:35:47.313426 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc5b31251a5dc29d740035ca7d95e60b39903587a9208d323b6f0dbd2da3c64\": container with ID starting with 0cc5b31251a5dc29d740035ca7d95e60b39903587a9208d323b6f0dbd2da3c64 not found: ID does not exist" containerID="0cc5b31251a5dc29d740035ca7d95e60b39903587a9208d323b6f0dbd2da3c64" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.313453 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc5b31251a5dc29d740035ca7d95e60b39903587a9208d323b6f0dbd2da3c64"} err="failed to get container status \"0cc5b31251a5dc29d740035ca7d95e60b39903587a9208d323b6f0dbd2da3c64\": rpc error: code = NotFound desc = could not find container \"0cc5b31251a5dc29d740035ca7d95e60b39903587a9208d323b6f0dbd2da3c64\": container with ID starting with 0cc5b31251a5dc29d740035ca7d95e60b39903587a9208d323b6f0dbd2da3c64 not found: ID does not exist" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.356399 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.356460 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 
08:35:47.391831 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:47 crc kubenswrapper[4755]: I1006 08:35:47.890913 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b21f87b-1784-4f36-ac14-ef94f7c19755" path="/var/lib/kubelet/pods/3b21f87b-1784-4f36-ac14-ef94f7c19755/volumes" Oct 06 08:35:48 crc kubenswrapper[4755]: I1006 08:35:48.296247 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:48 crc kubenswrapper[4755]: I1006 08:35:48.912498 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:35:48 crc kubenswrapper[4755]: I1006 08:35:48.912601 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:35:51 crc kubenswrapper[4755]: I1006 08:35:51.565275 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjw25"] Oct 06 08:35:51 crc kubenswrapper[4755]: I1006 08:35:51.565923 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qjw25" podUID="058fad58-c499-43be-816c-4ad22d92f35f" containerName="registry-server" containerID="cri-o://2ff4f60cc0f98a99f2da203e84f7005c4b5bd0d1a984bfffde2e860bc5891010" gracePeriod=2 Oct 06 08:35:52 crc kubenswrapper[4755]: I1006 08:35:52.287001 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="058fad58-c499-43be-816c-4ad22d92f35f" containerID="2ff4f60cc0f98a99f2da203e84f7005c4b5bd0d1a984bfffde2e860bc5891010" exitCode=0 Oct 06 08:35:52 crc kubenswrapper[4755]: I1006 08:35:52.287051 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjw25" event={"ID":"058fad58-c499-43be-816c-4ad22d92f35f","Type":"ContainerDied","Data":"2ff4f60cc0f98a99f2da203e84f7005c4b5bd0d1a984bfffde2e860bc5891010"} Oct 06 08:35:52 crc kubenswrapper[4755]: I1006 08:35:52.483988 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:52 crc kubenswrapper[4755]: I1006 08:35:52.663669 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/058fad58-c499-43be-816c-4ad22d92f35f-utilities\") pod \"058fad58-c499-43be-816c-4ad22d92f35f\" (UID: \"058fad58-c499-43be-816c-4ad22d92f35f\") " Oct 06 08:35:52 crc kubenswrapper[4755]: I1006 08:35:52.663867 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgsnj\" (UniqueName: \"kubernetes.io/projected/058fad58-c499-43be-816c-4ad22d92f35f-kube-api-access-fgsnj\") pod \"058fad58-c499-43be-816c-4ad22d92f35f\" (UID: \"058fad58-c499-43be-816c-4ad22d92f35f\") " Oct 06 08:35:52 crc kubenswrapper[4755]: I1006 08:35:52.663939 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/058fad58-c499-43be-816c-4ad22d92f35f-catalog-content\") pod \"058fad58-c499-43be-816c-4ad22d92f35f\" (UID: \"058fad58-c499-43be-816c-4ad22d92f35f\") " Oct 06 08:35:52 crc kubenswrapper[4755]: I1006 08:35:52.664626 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/058fad58-c499-43be-816c-4ad22d92f35f-utilities" (OuterVolumeSpecName: "utilities") pod 
"058fad58-c499-43be-816c-4ad22d92f35f" (UID: "058fad58-c499-43be-816c-4ad22d92f35f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:35:52 crc kubenswrapper[4755]: I1006 08:35:52.670599 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058fad58-c499-43be-816c-4ad22d92f35f-kube-api-access-fgsnj" (OuterVolumeSpecName: "kube-api-access-fgsnj") pod "058fad58-c499-43be-816c-4ad22d92f35f" (UID: "058fad58-c499-43be-816c-4ad22d92f35f"). InnerVolumeSpecName "kube-api-access-fgsnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:35:52 crc kubenswrapper[4755]: I1006 08:35:52.718982 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/058fad58-c499-43be-816c-4ad22d92f35f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "058fad58-c499-43be-816c-4ad22d92f35f" (UID: "058fad58-c499-43be-816c-4ad22d92f35f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:35:52 crc kubenswrapper[4755]: I1006 08:35:52.765542 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/058fad58-c499-43be-816c-4ad22d92f35f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:52 crc kubenswrapper[4755]: I1006 08:35:52.765595 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/058fad58-c499-43be-816c-4ad22d92f35f-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:52 crc kubenswrapper[4755]: I1006 08:35:52.765605 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgsnj\" (UniqueName: \"kubernetes.io/projected/058fad58-c499-43be-816c-4ad22d92f35f-kube-api-access-fgsnj\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:53 crc kubenswrapper[4755]: I1006 08:35:53.296391 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjw25" event={"ID":"058fad58-c499-43be-816c-4ad22d92f35f","Type":"ContainerDied","Data":"65defc6e06f2fc0b956b406eff5bd32883af6ee0d46cf736dbf5c35098b3f04b"} Oct 06 08:35:53 crc kubenswrapper[4755]: I1006 08:35:53.296475 4755 scope.go:117] "RemoveContainer" containerID="2ff4f60cc0f98a99f2da203e84f7005c4b5bd0d1a984bfffde2e860bc5891010" Oct 06 08:35:53 crc kubenswrapper[4755]: I1006 08:35:53.296475 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjw25" Oct 06 08:35:53 crc kubenswrapper[4755]: I1006 08:35:53.314326 4755 scope.go:117] "RemoveContainer" containerID="134a40303d745633857d9cc2693319d4e838433cd88691b8c50f45bccfa9a844" Oct 06 08:35:53 crc kubenswrapper[4755]: I1006 08:35:53.336917 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjw25"] Oct 06 08:35:53 crc kubenswrapper[4755]: I1006 08:35:53.344160 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qjw25"] Oct 06 08:35:53 crc kubenswrapper[4755]: I1006 08:35:53.346499 4755 scope.go:117] "RemoveContainer" containerID="76a67a55d99374baf13000d2d9a7f5b57f399435cf35ee612bf293a22d71a155" Oct 06 08:35:53 crc kubenswrapper[4755]: I1006 08:35:53.905165 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058fad58-c499-43be-816c-4ad22d92f35f" path="/var/lib/kubelet/pods/058fad58-c499-43be-816c-4ad22d92f35f/volumes" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.290794 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ncmz8"] Oct 06 08:35:57 crc kubenswrapper[4755]: E1006 08:35:57.291889 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b21f87b-1784-4f36-ac14-ef94f7c19755" containerName="extract-content" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.291906 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b21f87b-1784-4f36-ac14-ef94f7c19755" containerName="extract-content" Oct 06 08:35:57 crc kubenswrapper[4755]: E1006 08:35:57.291920 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058fad58-c499-43be-816c-4ad22d92f35f" containerName="extract-utilities" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.291928 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="058fad58-c499-43be-816c-4ad22d92f35f" containerName="extract-utilities" 
Oct 06 08:35:57 crc kubenswrapper[4755]: E1006 08:35:57.291936 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c15aea-bec5-47a6-ac2c-20af52549a69" containerName="extract-utilities" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.291943 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c15aea-bec5-47a6-ac2c-20af52549a69" containerName="extract-utilities" Oct 06 08:35:57 crc kubenswrapper[4755]: E1006 08:35:57.291952 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b21f87b-1784-4f36-ac14-ef94f7c19755" containerName="registry-server" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.291960 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b21f87b-1784-4f36-ac14-ef94f7c19755" containerName="registry-server" Oct 06 08:35:57 crc kubenswrapper[4755]: E1006 08:35:57.291972 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058fad58-c499-43be-816c-4ad22d92f35f" containerName="registry-server" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.291978 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="058fad58-c499-43be-816c-4ad22d92f35f" containerName="registry-server" Oct 06 08:35:57 crc kubenswrapper[4755]: E1006 08:35:57.291988 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058fad58-c499-43be-816c-4ad22d92f35f" containerName="extract-content" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.291996 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="058fad58-c499-43be-816c-4ad22d92f35f" containerName="extract-content" Oct 06 08:35:57 crc kubenswrapper[4755]: E1006 08:35:57.292016 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c15aea-bec5-47a6-ac2c-20af52549a69" containerName="registry-server" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.292027 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c15aea-bec5-47a6-ac2c-20af52549a69" containerName="registry-server" Oct 06 
08:35:57 crc kubenswrapper[4755]: E1006 08:35:57.292040 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c15aea-bec5-47a6-ac2c-20af52549a69" containerName="extract-content" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.292047 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c15aea-bec5-47a6-ac2c-20af52549a69" containerName="extract-content" Oct 06 08:35:57 crc kubenswrapper[4755]: E1006 08:35:57.292057 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b21f87b-1784-4f36-ac14-ef94f7c19755" containerName="extract-utilities" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.292064 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b21f87b-1784-4f36-ac14-ef94f7c19755" containerName="extract-utilities" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.292167 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="058fad58-c499-43be-816c-4ad22d92f35f" containerName="registry-server" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.292178 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b21f87b-1784-4f36-ac14-ef94f7c19755" containerName="registry-server" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.292191 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c15aea-bec5-47a6-ac2c-20af52549a69" containerName="registry-server" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.292966 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.306350 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ncmz8"] Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.453380 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94bhc\" (UniqueName: \"kubernetes.io/projected/4f20ab27-b041-4992-8422-f575dc76123f-kube-api-access-94bhc\") pod \"redhat-operators-ncmz8\" (UID: \"4f20ab27-b041-4992-8422-f575dc76123f\") " pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.453466 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f20ab27-b041-4992-8422-f575dc76123f-utilities\") pod \"redhat-operators-ncmz8\" (UID: \"4f20ab27-b041-4992-8422-f575dc76123f\") " pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.453540 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f20ab27-b041-4992-8422-f575dc76123f-catalog-content\") pod \"redhat-operators-ncmz8\" (UID: \"4f20ab27-b041-4992-8422-f575dc76123f\") " pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.554702 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f20ab27-b041-4992-8422-f575dc76123f-catalog-content\") pod \"redhat-operators-ncmz8\" (UID: \"4f20ab27-b041-4992-8422-f575dc76123f\") " pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.554786 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-94bhc\" (UniqueName: \"kubernetes.io/projected/4f20ab27-b041-4992-8422-f575dc76123f-kube-api-access-94bhc\") pod \"redhat-operators-ncmz8\" (UID: \"4f20ab27-b041-4992-8422-f575dc76123f\") " pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.554832 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f20ab27-b041-4992-8422-f575dc76123f-utilities\") pod \"redhat-operators-ncmz8\" (UID: \"4f20ab27-b041-4992-8422-f575dc76123f\") " pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.555331 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f20ab27-b041-4992-8422-f575dc76123f-utilities\") pod \"redhat-operators-ncmz8\" (UID: \"4f20ab27-b041-4992-8422-f575dc76123f\") " pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.555345 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f20ab27-b041-4992-8422-f575dc76123f-catalog-content\") pod \"redhat-operators-ncmz8\" (UID: \"4f20ab27-b041-4992-8422-f575dc76123f\") " pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.582438 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94bhc\" (UniqueName: \"kubernetes.io/projected/4f20ab27-b041-4992-8422-f575dc76123f-kube-api-access-94bhc\") pod \"redhat-operators-ncmz8\" (UID: \"4f20ab27-b041-4992-8422-f575dc76123f\") " pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:35:57 crc kubenswrapper[4755]: I1006 08:35:57.622624 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:35:58 crc kubenswrapper[4755]: I1006 08:35:58.031858 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ncmz8"] Oct 06 08:35:58 crc kubenswrapper[4755]: W1006 08:35:58.041362 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f20ab27_b041_4992_8422_f575dc76123f.slice/crio-e68dd67f36ba3608d2ce0ffe609cbbd378f228037828b81a504458dc1545b010 WatchSource:0}: Error finding container e68dd67f36ba3608d2ce0ffe609cbbd378f228037828b81a504458dc1545b010: Status 404 returned error can't find the container with id e68dd67f36ba3608d2ce0ffe609cbbd378f228037828b81a504458dc1545b010 Oct 06 08:35:58 crc kubenswrapper[4755]: I1006 08:35:58.336514 4755 generic.go:334] "Generic (PLEG): container finished" podID="4f20ab27-b041-4992-8422-f575dc76123f" containerID="5d034e19f06172b121e9647bce116ac7dd9c5fc48211abaccb4bfbcc956cce01" exitCode=0 Oct 06 08:35:58 crc kubenswrapper[4755]: I1006 08:35:58.336583 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncmz8" event={"ID":"4f20ab27-b041-4992-8422-f575dc76123f","Type":"ContainerDied","Data":"5d034e19f06172b121e9647bce116ac7dd9c5fc48211abaccb4bfbcc956cce01"} Oct 06 08:35:58 crc kubenswrapper[4755]: I1006 08:35:58.336854 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncmz8" event={"ID":"4f20ab27-b041-4992-8422-f575dc76123f","Type":"ContainerStarted","Data":"e68dd67f36ba3608d2ce0ffe609cbbd378f228037828b81a504458dc1545b010"} Oct 06 08:36:00 crc kubenswrapper[4755]: I1006 08:36:00.352068 4755 generic.go:334] "Generic (PLEG): container finished" podID="4f20ab27-b041-4992-8422-f575dc76123f" containerID="9438887d50183d51f94d4c63b37e440ea4c502310d606239fdc435ce0584a9f7" exitCode=0 Oct 06 08:36:00 crc kubenswrapper[4755]: I1006 08:36:00.352173 
4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncmz8" event={"ID":"4f20ab27-b041-4992-8422-f575dc76123f","Type":"ContainerDied","Data":"9438887d50183d51f94d4c63b37e440ea4c502310d606239fdc435ce0584a9f7"} Oct 06 08:36:01 crc kubenswrapper[4755]: I1006 08:36:01.361997 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncmz8" event={"ID":"4f20ab27-b041-4992-8422-f575dc76123f","Type":"ContainerStarted","Data":"e2a6062e08b153707621e7bef5084535e50ecf965a2fb15211d1560f6025e902"} Oct 06 08:36:01 crc kubenswrapper[4755]: I1006 08:36:01.383152 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ncmz8" podStartSLOduration=1.903595382 podStartE2EDuration="4.383127328s" podCreationTimestamp="2025-10-06 08:35:57 +0000 UTC" firstStartedPulling="2025-10-06 08:35:58.338606409 +0000 UTC m=+815.167921623" lastFinishedPulling="2025-10-06 08:36:00.818138355 +0000 UTC m=+817.647453569" observedRunningTime="2025-10-06 08:36:01.380719129 +0000 UTC m=+818.210034343" watchObservedRunningTime="2025-10-06 08:36:01.383127328 +0000 UTC m=+818.212442582" Oct 06 08:36:07 crc kubenswrapper[4755]: I1006 08:36:07.622972 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:36:07 crc kubenswrapper[4755]: I1006 08:36:07.625059 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:36:07 crc kubenswrapper[4755]: I1006 08:36:07.659660 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:36:08 crc kubenswrapper[4755]: I1006 08:36:08.459552 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:36:08 crc kubenswrapper[4755]: I1006 
08:36:08.504828 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ncmz8"] Oct 06 08:36:10 crc kubenswrapper[4755]: I1006 08:36:10.419674 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ncmz8" podUID="4f20ab27-b041-4992-8422-f575dc76123f" containerName="registry-server" containerID="cri-o://e2a6062e08b153707621e7bef5084535e50ecf965a2fb15211d1560f6025e902" gracePeriod=2 Oct 06 08:36:11 crc kubenswrapper[4755]: I1006 08:36:11.430557 4755 generic.go:334] "Generic (PLEG): container finished" podID="4f20ab27-b041-4992-8422-f575dc76123f" containerID="e2a6062e08b153707621e7bef5084535e50ecf965a2fb15211d1560f6025e902" exitCode=0 Oct 06 08:36:11 crc kubenswrapper[4755]: I1006 08:36:11.431680 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncmz8" event={"ID":"4f20ab27-b041-4992-8422-f575dc76123f","Type":"ContainerDied","Data":"e2a6062e08b153707621e7bef5084535e50ecf965a2fb15211d1560f6025e902"} Oct 06 08:36:11 crc kubenswrapper[4755]: I1006 08:36:11.935478 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.053410 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f20ab27-b041-4992-8422-f575dc76123f-utilities\") pod \"4f20ab27-b041-4992-8422-f575dc76123f\" (UID: \"4f20ab27-b041-4992-8422-f575dc76123f\") " Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.053796 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94bhc\" (UniqueName: \"kubernetes.io/projected/4f20ab27-b041-4992-8422-f575dc76123f-kube-api-access-94bhc\") pod \"4f20ab27-b041-4992-8422-f575dc76123f\" (UID: \"4f20ab27-b041-4992-8422-f575dc76123f\") " Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.053910 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f20ab27-b041-4992-8422-f575dc76123f-catalog-content\") pod \"4f20ab27-b041-4992-8422-f575dc76123f\" (UID: \"4f20ab27-b041-4992-8422-f575dc76123f\") " Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.054302 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f20ab27-b041-4992-8422-f575dc76123f-utilities" (OuterVolumeSpecName: "utilities") pod "4f20ab27-b041-4992-8422-f575dc76123f" (UID: "4f20ab27-b041-4992-8422-f575dc76123f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.058643 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f20ab27-b041-4992-8422-f575dc76123f-kube-api-access-94bhc" (OuterVolumeSpecName: "kube-api-access-94bhc") pod "4f20ab27-b041-4992-8422-f575dc76123f" (UID: "4f20ab27-b041-4992-8422-f575dc76123f"). InnerVolumeSpecName "kube-api-access-94bhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.135233 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f20ab27-b041-4992-8422-f575dc76123f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f20ab27-b041-4992-8422-f575dc76123f" (UID: "4f20ab27-b041-4992-8422-f575dc76123f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.155302 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f20ab27-b041-4992-8422-f575dc76123f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.155331 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f20ab27-b041-4992-8422-f575dc76123f-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.155341 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94bhc\" (UniqueName: \"kubernetes.io/projected/4f20ab27-b041-4992-8422-f575dc76123f-kube-api-access-94bhc\") on node \"crc\" DevicePath \"\"" Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.438327 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncmz8" event={"ID":"4f20ab27-b041-4992-8422-f575dc76123f","Type":"ContainerDied","Data":"e68dd67f36ba3608d2ce0ffe609cbbd378f228037828b81a504458dc1545b010"} Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.438382 4755 scope.go:117] "RemoveContainer" containerID="e2a6062e08b153707621e7bef5084535e50ecf965a2fb15211d1560f6025e902" Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.438382 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ncmz8" Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.453227 4755 scope.go:117] "RemoveContainer" containerID="9438887d50183d51f94d4c63b37e440ea4c502310d606239fdc435ce0584a9f7" Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.470984 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ncmz8"] Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.474494 4755 scope.go:117] "RemoveContainer" containerID="5d034e19f06172b121e9647bce116ac7dd9c5fc48211abaccb4bfbcc956cce01" Oct 06 08:36:12 crc kubenswrapper[4755]: I1006 08:36:12.479023 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ncmz8"] Oct 06 08:36:13 crc kubenswrapper[4755]: I1006 08:36:13.885881 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f20ab27-b041-4992-8422-f575dc76123f" path="/var/lib/kubelet/pods/4f20ab27-b041-4992-8422-f575dc76123f/volumes" Oct 06 08:36:18 crc kubenswrapper[4755]: I1006 08:36:18.912501 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:36:18 crc kubenswrapper[4755]: I1006 08:36:18.913211 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:36:18 crc kubenswrapper[4755]: I1006 08:36:18.913281 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:36:18 crc 
kubenswrapper[4755]: I1006 08:36:18.913962 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d429b678b36d347ceb6d82738a5216f8e1c07a0afd1e703d9e929f6a065850ec"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:36:18 crc kubenswrapper[4755]: I1006 08:36:18.914022 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://d429b678b36d347ceb6d82738a5216f8e1c07a0afd1e703d9e929f6a065850ec" gracePeriod=600 Oct 06 08:36:19 crc kubenswrapper[4755]: I1006 08:36:19.501366 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="d429b678b36d347ceb6d82738a5216f8e1c07a0afd1e703d9e929f6a065850ec" exitCode=0 Oct 06 08:36:19 crc kubenswrapper[4755]: I1006 08:36:19.501441 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"d429b678b36d347ceb6d82738a5216f8e1c07a0afd1e703d9e929f6a065850ec"} Oct 06 08:36:19 crc kubenswrapper[4755]: I1006 08:36:19.501768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"81b36d63c3c7ca9fbafe357e61481e8979d6babd72103e4b42d972dd0f76d2d5"} Oct 06 08:36:19 crc kubenswrapper[4755]: I1006 08:36:19.501806 4755 scope.go:117] "RemoveContainer" containerID="d91d9012e478d7f838adb567aaf83be7e24217db74ea1547bb0d299bd1231bbd" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.181954 4755 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n"] Oct 06 08:36:20 crc kubenswrapper[4755]: E1006 08:36:20.182494 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f20ab27-b041-4992-8422-f575dc76123f" containerName="registry-server" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.182506 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f20ab27-b041-4992-8422-f575dc76123f" containerName="registry-server" Oct 06 08:36:20 crc kubenswrapper[4755]: E1006 08:36:20.182520 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f20ab27-b041-4992-8422-f575dc76123f" containerName="extract-utilities" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.182526 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f20ab27-b041-4992-8422-f575dc76123f" containerName="extract-utilities" Oct 06 08:36:20 crc kubenswrapper[4755]: E1006 08:36:20.182535 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f20ab27-b041-4992-8422-f575dc76123f" containerName="extract-content" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.182541 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f20ab27-b041-4992-8422-f575dc76123f" containerName="extract-content" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.182677 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f20ab27-b041-4992-8422-f575dc76123f" containerName="registry-server" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.183280 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.186146 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-nt7db" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.195447 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.215682 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.217476 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.219381 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-dzck6" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.223290 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.226795 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.226946 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.228117 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.230871 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-knqgq" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.231206 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-w9jfs" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.234654 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.244433 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.270021 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.271359 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.273491 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.274498 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-7xfjz" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.296172 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.297369 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.302617 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.309197 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qsfj4" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.321478 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.355824 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.356864 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.363146 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-wmhxs" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.364243 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fftns\" (UniqueName: \"kubernetes.io/projected/1412ad22-876d-4924-9f9b-468970063426-kube-api-access-fftns\") pod \"designate-operator-controller-manager-75dfd9b554-zs28l\" (UID: \"1412ad22-876d-4924-9f9b-468970063426\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.364294 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkb8n\" (UniqueName: \"kubernetes.io/projected/2c9d42bf-9896-4198-aeb9-352d080978d0-kube-api-access-rkb8n\") pod \"heat-operator-controller-manager-8f58bc9db-jnt7p\" (UID: \"2c9d42bf-9896-4198-aeb9-352d080978d0\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.364322 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9p54\" (UniqueName: \"kubernetes.io/projected/5e7b409c-75ff-43bf-87e1-9a7877dd21f3-kube-api-access-g9p54\") pod \"cinder-operator-controller-manager-78fdc95566-rdwj9\" (UID: \"5e7b409c-75ff-43bf-87e1-9a7877dd21f3\") " pod="openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.364344 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szwdj\" (UniqueName: 
\"kubernetes.io/projected/169c4b1e-417b-4ddf-9886-6c4668257712-kube-api-access-szwdj\") pod \"glance-operator-controller-manager-5568b5d68-w6rtg\" (UID: \"169c4b1e-417b-4ddf-9886-6c4668257712\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.364370 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr8hv\" (UniqueName: \"kubernetes.io/projected/cc16a1c7-6450-414d-9e4b-518014071887-kube-api-access-xr8hv\") pod \"barbican-operator-controller-manager-5f7c849b98-lbg2n\" (UID: \"cc16a1c7-6450-414d-9e4b-518014071887\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.369257 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.371093 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.374031 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-v7hxx" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.374258 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.380637 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.403167 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.408546 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.409610 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.413257 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6zjwj" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.441073 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.452877 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.454148 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.459918 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-679hv" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.460246 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.465314 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc16e4a5-7b17-4f64-840e-1d0f6971c7a4-cert\") pod \"infra-operator-controller-manager-658588b8c9-6nqcm\" (UID: \"cc16e4a5-7b17-4f64-840e-1d0f6971c7a4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.465358 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fftns\" (UniqueName: 
\"kubernetes.io/projected/1412ad22-876d-4924-9f9b-468970063426-kube-api-access-fftns\") pod \"designate-operator-controller-manager-75dfd9b554-zs28l\" (UID: \"1412ad22-876d-4924-9f9b-468970063426\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.465399 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94zf6\" (UniqueName: \"kubernetes.io/projected/d33c2461-9722-468a-b6e4-20b4ce822f18-kube-api-access-94zf6\") pod \"horizon-operator-controller-manager-54876c876f-9qkff\" (UID: \"d33c2461-9722-468a-b6e4-20b4ce822f18\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.465443 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkb8n\" (UniqueName: \"kubernetes.io/projected/2c9d42bf-9896-4198-aeb9-352d080978d0-kube-api-access-rkb8n\") pod \"heat-operator-controller-manager-8f58bc9db-jnt7p\" (UID: \"2c9d42bf-9896-4198-aeb9-352d080978d0\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.465468 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqqwc\" (UniqueName: \"kubernetes.io/projected/cc16e4a5-7b17-4f64-840e-1d0f6971c7a4-kube-api-access-mqqwc\") pod \"infra-operator-controller-manager-658588b8c9-6nqcm\" (UID: \"cc16e4a5-7b17-4f64-840e-1d0f6971c7a4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.465489 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9p54\" (UniqueName: \"kubernetes.io/projected/5e7b409c-75ff-43bf-87e1-9a7877dd21f3-kube-api-access-g9p54\") pod 
\"cinder-operator-controller-manager-78fdc95566-rdwj9\" (UID: \"5e7b409c-75ff-43bf-87e1-9a7877dd21f3\") " pod="openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.465514 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szwdj\" (UniqueName: \"kubernetes.io/projected/169c4b1e-417b-4ddf-9886-6c4668257712-kube-api-access-szwdj\") pod \"glance-operator-controller-manager-5568b5d68-w6rtg\" (UID: \"169c4b1e-417b-4ddf-9886-6c4668257712\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.465537 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc9dx\" (UniqueName: \"kubernetes.io/projected/206b28f4-45a9-4352-bb98-717c408dfcac-kube-api-access-wc9dx\") pod \"ironic-operator-controller-manager-699b87f775-ld9kw\" (UID: \"206b28f4-45a9-4352-bb98-717c408dfcac\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.465557 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr8hv\" (UniqueName: \"kubernetes.io/projected/cc16a1c7-6450-414d-9e4b-518014071887-kube-api-access-xr8hv\") pod \"barbican-operator-controller-manager-5f7c849b98-lbg2n\" (UID: \"cc16a1c7-6450-414d-9e4b-518014071887\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.487992 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.489238 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.500694 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-sjb8l" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.501009 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkb8n\" (UniqueName: \"kubernetes.io/projected/2c9d42bf-9896-4198-aeb9-352d080978d0-kube-api-access-rkb8n\") pod \"heat-operator-controller-manager-8f58bc9db-jnt7p\" (UID: \"2c9d42bf-9896-4198-aeb9-352d080978d0\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.506192 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szwdj\" (UniqueName: \"kubernetes.io/projected/169c4b1e-417b-4ddf-9886-6c4668257712-kube-api-access-szwdj\") pod \"glance-operator-controller-manager-5568b5d68-w6rtg\" (UID: \"169c4b1e-417b-4ddf-9886-6c4668257712\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.510443 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9p54\" (UniqueName: \"kubernetes.io/projected/5e7b409c-75ff-43bf-87e1-9a7877dd21f3-kube-api-access-g9p54\") pod \"cinder-operator-controller-manager-78fdc95566-rdwj9\" (UID: \"5e7b409c-75ff-43bf-87e1-9a7877dd21f3\") " pod="openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.520261 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fftns\" (UniqueName: \"kubernetes.io/projected/1412ad22-876d-4924-9f9b-468970063426-kube-api-access-fftns\") pod \"designate-operator-controller-manager-75dfd9b554-zs28l\" (UID: 
\"1412ad22-876d-4924-9f9b-468970063426\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.526449 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr8hv\" (UniqueName: \"kubernetes.io/projected/cc16a1c7-6450-414d-9e4b-518014071887-kube-api-access-xr8hv\") pod \"barbican-operator-controller-manager-5f7c849b98-lbg2n\" (UID: \"cc16a1c7-6450-414d-9e4b-518014071887\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.538344 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.541910 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.543273 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.564223 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-swd49" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.564631 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.567544 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc9dx\" (UniqueName: \"kubernetes.io/projected/206b28f4-45a9-4352-bb98-717c408dfcac-kube-api-access-wc9dx\") pod \"ironic-operator-controller-manager-699b87f775-ld9kw\" (UID: \"206b28f4-45a9-4352-bb98-717c408dfcac\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.567666 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdhqk\" (UniqueName: \"kubernetes.io/projected/a151d352-3084-45b5-80b6-48510de0f087-kube-api-access-jdhqk\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q\" (UID: \"a151d352-3084-45b5-80b6-48510de0f087\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.567766 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc16e4a5-7b17-4f64-840e-1d0f6971c7a4-cert\") pod \"infra-operator-controller-manager-658588b8c9-6nqcm\" (UID: \"cc16e4a5-7b17-4f64-840e-1d0f6971c7a4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.567846 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g4x2\" (UniqueName: \"kubernetes.io/projected/26edd385-18c0-41cf-8094-e1844f07364a-kube-api-access-4g4x2\") pod \"manila-operator-controller-manager-65d89cfd9f-zkdrt\" (UID: \"26edd385-18c0-41cf-8094-e1844f07364a\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 
08:36:20.567926 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94zf6\" (UniqueName: \"kubernetes.io/projected/d33c2461-9722-468a-b6e4-20b4ce822f18-kube-api-access-94zf6\") pod \"horizon-operator-controller-manager-54876c876f-9qkff\" (UID: \"d33c2461-9722-468a-b6e4-20b4ce822f18\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.568027 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqqwc\" (UniqueName: \"kubernetes.io/projected/cc16e4a5-7b17-4f64-840e-1d0f6971c7a4-kube-api-access-mqqwc\") pod \"infra-operator-controller-manager-658588b8c9-6nqcm\" (UID: \"cc16e4a5-7b17-4f64-840e-1d0f6971c7a4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.568122 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnhh4\" (UniqueName: \"kubernetes.io/projected/5fbfa495-f4d7-4bf8-a489-f8d24476fbf2-kube-api-access-rnhh4\") pod \"keystone-operator-controller-manager-655d88ccb9-6kmrp\" (UID: \"5fbfa495-f4d7-4bf8-a489-f8d24476fbf2\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.599051 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.599065 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc16e4a5-7b17-4f64-840e-1d0f6971c7a4-cert\") pod \"infra-operator-controller-manager-658588b8c9-6nqcm\" (UID: \"cc16e4a5-7b17-4f64-840e-1d0f6971c7a4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" Oct 06 08:36:20 crc kubenswrapper[4755]: 
I1006 08:36:20.601344 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.621416 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.636494 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqqwc\" (UniqueName: \"kubernetes.io/projected/cc16e4a5-7b17-4f64-840e-1d0f6971c7a4-kube-api-access-mqqwc\") pod \"infra-operator-controller-manager-658588b8c9-6nqcm\" (UID: \"cc16e4a5-7b17-4f64-840e-1d0f6971c7a4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.645189 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.648961 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.650680 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.655339 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc9dx\" (UniqueName: \"kubernetes.io/projected/206b28f4-45a9-4352-bb98-717c408dfcac-kube-api-access-wc9dx\") pod \"ironic-operator-controller-manager-699b87f775-ld9kw\" (UID: \"206b28f4-45a9-4352-bb98-717c408dfcac\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.656282 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-zh5ff" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.657045 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94zf6\" (UniqueName: \"kubernetes.io/projected/d33c2461-9722-468a-b6e4-20b4ce822f18-kube-api-access-94zf6\") pod \"horizon-operator-controller-manager-54876c876f-9qkff\" (UID: \"d33c2461-9722-468a-b6e4-20b4ce822f18\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.662717 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.663002 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.681071 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnhh4\" (UniqueName: \"kubernetes.io/projected/5fbfa495-f4d7-4bf8-a489-f8d24476fbf2-kube-api-access-rnhh4\") pod \"keystone-operator-controller-manager-655d88ccb9-6kmrp\" (UID: \"5fbfa495-f4d7-4bf8-a489-f8d24476fbf2\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.691934 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.692779 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdhqk\" (UniqueName: \"kubernetes.io/projected/a151d352-3084-45b5-80b6-48510de0f087-kube-api-access-jdhqk\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q\" (UID: \"a151d352-3084-45b5-80b6-48510de0f087\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.693006 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwxhn\" (UniqueName: \"kubernetes.io/projected/fa923997-ddf0-4e0a-9ef9-bde22b553dfb-kube-api-access-hwxhn\") pod \"neutron-operator-controller-manager-8d984cc4d-6cfbt\" (UID: \"fa923997-ddf0-4e0a-9ef9-bde22b553dfb\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.693184 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g4x2\" (UniqueName: \"kubernetes.io/projected/26edd385-18c0-41cf-8094-e1844f07364a-kube-api-access-4g4x2\") pod 
\"manila-operator-controller-manager-65d89cfd9f-zkdrt\" (UID: \"26edd385-18c0-41cf-8094-e1844f07364a\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.703337 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.704529 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnhh4\" (UniqueName: \"kubernetes.io/projected/5fbfa495-f4d7-4bf8-a489-f8d24476fbf2-kube-api-access-rnhh4\") pod \"keystone-operator-controller-manager-655d88ccb9-6kmrp\" (UID: \"5fbfa495-f4d7-4bf8-a489-f8d24476fbf2\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.705850 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.709753 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.712654 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g4x2\" (UniqueName: \"kubernetes.io/projected/26edd385-18c0-41cf-8094-e1844f07364a-kube-api-access-4g4x2\") pod \"manila-operator-controller-manager-65d89cfd9f-zkdrt\" (UID: \"26edd385-18c0-41cf-8094-e1844f07364a\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.713573 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdhqk\" (UniqueName: \"kubernetes.io/projected/a151d352-3084-45b5-80b6-48510de0f087-kube-api-access-jdhqk\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q\" (UID: \"a151d352-3084-45b5-80b6-48510de0f087\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.713890 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.715379 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-qnww9" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.723343 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.724531 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.727831 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-4rrpz" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.739586 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.746848 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.748507 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.753117 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-555f5"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.753250 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.753357 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-l9jhq" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.764955 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-555f5"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.765002 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg"] Oct 06 08:36:20 crc 
kubenswrapper[4755]: I1006 08:36:20.765136 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-555f5" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.774146 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-r4ftp" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.779698 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.787010 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.789807 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.790951 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.794192 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-2lpc5" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.795508 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgdlg\" (UniqueName: \"kubernetes.io/projected/771c3503-d156-46de-81b3-fe3845ecd58f-kube-api-access-fgdlg\") pod \"nova-operator-controller-manager-7c7fc454ff-nk299\" (UID: \"771c3503-d156-46de-81b3-fe3845ecd58f\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.795687 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcmr4\" (UniqueName: \"kubernetes.io/projected/188c5ff1-ba40-4c30-b411-d5beb8cdb4e8-kube-api-access-zcmr4\") pod \"octavia-operator-controller-manager-7468f855d8-k5658\" (UID: \"188c5ff1-ba40-4c30-b411-d5beb8cdb4e8\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.795775 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66jv9\" (UniqueName: \"kubernetes.io/projected/96e87134-c1a1-49fb-9e05-59e10699741f-kube-api-access-66jv9\") pod \"ovn-operator-controller-manager-579449c7d5-gpblg\" (UID: \"96e87134-c1a1-49fb-9e05-59e10699741f\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.795879 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwxhn\" (UniqueName: 
\"kubernetes.io/projected/fa923997-ddf0-4e0a-9ef9-bde22b553dfb-kube-api-access-hwxhn\") pod \"neutron-operator-controller-manager-8d984cc4d-6cfbt\" (UID: \"fa923997-ddf0-4e0a-9ef9-bde22b553dfb\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.807895 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.825755 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.829822 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.831552 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-r8wgf" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.834276 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwxhn\" (UniqueName: \"kubernetes.io/projected/fa923997-ddf0-4e0a-9ef9-bde22b553dfb-kube-api-access-hwxhn\") pod \"neutron-operator-controller-manager-8d984cc4d-6cfbt\" (UID: \"fa923997-ddf0-4e0a-9ef9-bde22b553dfb\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.858842 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.862714 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6"] Oct 06 08:36:20 crc kubenswrapper[4755]: 
I1006 08:36:20.897190 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.897874 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgdlg\" (UniqueName: \"kubernetes.io/projected/771c3503-d156-46de-81b3-fe3845ecd58f-kube-api-access-fgdlg\") pod \"nova-operator-controller-manager-7c7fc454ff-nk299\" (UID: \"771c3503-d156-46de-81b3-fe3845ecd58f\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.897925 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl8ct\" (UniqueName: \"kubernetes.io/projected/1a519b4b-8c97-4154-b87c-2cbd91e4453b-kube-api-access-hl8ct\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9\" (UID: \"1a519b4b-8c97-4154-b87c-2cbd91e4453b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.897956 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82xzx\" (UniqueName: \"kubernetes.io/projected/57a14562-fc08-4785-a24c-ead1cb0919e6-kube-api-access-82xzx\") pod \"telemetry-operator-controller-manager-5d4d74dd89-n6gl6\" (UID: \"57a14562-fc08-4785-a24c-ead1cb0919e6\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.897998 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a519b4b-8c97-4154-b87c-2cbd91e4453b-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9\" (UID: \"1a519b4b-8c97-4154-b87c-2cbd91e4453b\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.898023 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv6mp\" (UniqueName: \"kubernetes.io/projected/746ef71e-2879-4789-879d-a4479700346e-kube-api-access-pv6mp\") pod \"placement-operator-controller-manager-54689d9f88-555f5\" (UID: \"746ef71e-2879-4789-879d-a4479700346e\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-555f5" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.898049 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjmc4\" (UniqueName: \"kubernetes.io/projected/c0fe017f-b521-4146-a2fe-7b790d585e22-kube-api-access-qjmc4\") pod \"swift-operator-controller-manager-6859f9b676-dbmlb\" (UID: \"c0fe017f-b521-4146-a2fe-7b790d585e22\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.898097 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcmr4\" (UniqueName: \"kubernetes.io/projected/188c5ff1-ba40-4c30-b411-d5beb8cdb4e8-kube-api-access-zcmr4\") pod \"octavia-operator-controller-manager-7468f855d8-k5658\" (UID: \"188c5ff1-ba40-4c30-b411-d5beb8cdb4e8\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.898139 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66jv9\" (UniqueName: \"kubernetes.io/projected/96e87134-c1a1-49fb-9e05-59e10699741f-kube-api-access-66jv9\") pod \"ovn-operator-controller-manager-579449c7d5-gpblg\" (UID: \"96e87134-c1a1-49fb-9e05-59e10699741f\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg" Oct 06 08:36:20 crc 
kubenswrapper[4755]: I1006 08:36:20.900772 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.903951 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-dj55b" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.922930 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.944396 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcmr4\" (UniqueName: \"kubernetes.io/projected/188c5ff1-ba40-4c30-b411-d5beb8cdb4e8-kube-api-access-zcmr4\") pod \"octavia-operator-controller-manager-7468f855d8-k5658\" (UID: \"188c5ff1-ba40-4c30-b411-d5beb8cdb4e8\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.956019 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66jv9\" (UniqueName: \"kubernetes.io/projected/96e87134-c1a1-49fb-9e05-59e10699741f-kube-api-access-66jv9\") pod \"ovn-operator-controller-manager-579449c7d5-gpblg\" (UID: \"96e87134-c1a1-49fb-9e05-59e10699741f\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.956353 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgdlg\" (UniqueName: \"kubernetes.io/projected/771c3503-d156-46de-81b3-fe3845ecd58f-kube-api-access-fgdlg\") pod \"nova-operator-controller-manager-7c7fc454ff-nk299\" (UID: \"771c3503-d156-46de-81b3-fe3845ecd58f\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 
08:36:20.978945 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw"] Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.979133 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.980159 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.984717 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-hcxcc" Oct 06 08:36:20 crc kubenswrapper[4755]: I1006 08:36:20.995793 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw"] Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.003306 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl8ct\" (UniqueName: \"kubernetes.io/projected/1a519b4b-8c97-4154-b87c-2cbd91e4453b-kube-api-access-hl8ct\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9\" (UID: \"1a519b4b-8c97-4154-b87c-2cbd91e4453b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.003363 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82xzx\" (UniqueName: \"kubernetes.io/projected/57a14562-fc08-4785-a24c-ead1cb0919e6-kube-api-access-82xzx\") pod \"telemetry-operator-controller-manager-5d4d74dd89-n6gl6\" (UID: \"57a14562-fc08-4785-a24c-ead1cb0919e6\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.003393 
4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h8t4\" (UniqueName: \"kubernetes.io/projected/73a255d0-1e8a-43c9-b27c-e7ff650c3c79-kube-api-access-5h8t4\") pod \"test-operator-controller-manager-5cd5cb47d7-8sf8c\" (UID: \"73a255d0-1e8a-43c9-b27c-e7ff650c3c79\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.003433 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv6mp\" (UniqueName: \"kubernetes.io/projected/746ef71e-2879-4789-879d-a4479700346e-kube-api-access-pv6mp\") pod \"placement-operator-controller-manager-54689d9f88-555f5\" (UID: \"746ef71e-2879-4789-879d-a4479700346e\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-555f5" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.003458 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a519b4b-8c97-4154-b87c-2cbd91e4453b-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9\" (UID: \"1a519b4b-8c97-4154-b87c-2cbd91e4453b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.003482 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjmc4\" (UniqueName: \"kubernetes.io/projected/c0fe017f-b521-4146-a2fe-7b790d585e22-kube-api-access-qjmc4\") pod \"swift-operator-controller-manager-6859f9b676-dbmlb\" (UID: \"c0fe017f-b521-4146-a2fe-7b790d585e22\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.004086 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt" Oct 06 08:36:21 crc kubenswrapper[4755]: E1006 08:36:21.004970 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 08:36:21 crc kubenswrapper[4755]: E1006 08:36:21.005013 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a519b4b-8c97-4154-b87c-2cbd91e4453b-cert podName:1a519b4b-8c97-4154-b87c-2cbd91e4453b nodeName:}" failed. No retries permitted until 2025-10-06 08:36:21.504996287 +0000 UTC m=+838.334311491 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a519b4b-8c97-4154-b87c-2cbd91e4453b-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" (UID: "1a519b4b-8c97-4154-b87c-2cbd91e4453b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.016984 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.039420 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82xzx\" (UniqueName: \"kubernetes.io/projected/57a14562-fc08-4785-a24c-ead1cb0919e6-kube-api-access-82xzx\") pod \"telemetry-operator-controller-manager-5d4d74dd89-n6gl6\" (UID: \"57a14562-fc08-4785-a24c-ead1cb0919e6\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.040975 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.042797 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl8ct\" (UniqueName: \"kubernetes.io/projected/1a519b4b-8c97-4154-b87c-2cbd91e4453b-kube-api-access-hl8ct\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9\" (UID: \"1a519b4b-8c97-4154-b87c-2cbd91e4453b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.053555 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjmc4\" (UniqueName: \"kubernetes.io/projected/c0fe017f-b521-4146-a2fe-7b790d585e22-kube-api-access-qjmc4\") pod \"swift-operator-controller-manager-6859f9b676-dbmlb\" (UID: \"c0fe017f-b521-4146-a2fe-7b790d585e22\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.057486 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv6mp\" (UniqueName: \"kubernetes.io/projected/746ef71e-2879-4789-879d-a4479700346e-kube-api-access-pv6mp\") pod \"placement-operator-controller-manager-54689d9f88-555f5\" (UID: \"746ef71e-2879-4789-879d-a4479700346e\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-555f5" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.060063 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.078672 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2"] Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.079867 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.088744 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5fdlp" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.089029 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.095621 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2"] Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.100684 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb"] Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.115553 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.128696 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h8t4\" (UniqueName: \"kubernetes.io/projected/73a255d0-1e8a-43c9-b27c-e7ff650c3c79-kube-api-access-5h8t4\") pod \"test-operator-controller-manager-5cd5cb47d7-8sf8c\" (UID: \"73a255d0-1e8a-43c9-b27c-e7ff650c3c79\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.128834 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp77d\" (UniqueName: \"kubernetes.io/projected/51cb4df4-8c6a-4563-bd07-9c05182b4216-kube-api-access-xp77d\") pod \"watcher-operator-controller-manager-6cbc6dd547-gjkcw\" (UID: \"51cb4df4-8c6a-4563-bd07-9c05182b4216\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.132879 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-9jsg6" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.135866 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-555f5" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.157902 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb"] Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.160003 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.171005 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h8t4\" (UniqueName: \"kubernetes.io/projected/73a255d0-1e8a-43c9-b27c-e7ff650c3c79-kube-api-access-5h8t4\") pod \"test-operator-controller-manager-5cd5cb47d7-8sf8c\" (UID: \"73a255d0-1e8a-43c9-b27c-e7ff650c3c79\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.177820 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.230398 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9f27028-c7a1-4bee-bb82-41c4b0354da1-cert\") pod \"openstack-operator-controller-manager-7f9f8b87ff-hgpl2\" (UID: \"f9f27028-c7a1-4bee-bb82-41c4b0354da1\") " pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.230487 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mz2f\" (UniqueName: \"kubernetes.io/projected/1091a8d9-172e-4016-b354-16329cdab528-kube-api-access-8mz2f\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb\" (UID: \"1091a8d9-172e-4016-b354-16329cdab528\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.230605 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8nb5\" (UniqueName: \"kubernetes.io/projected/f9f27028-c7a1-4bee-bb82-41c4b0354da1-kube-api-access-r8nb5\") pod 
\"openstack-operator-controller-manager-7f9f8b87ff-hgpl2\" (UID: \"f9f27028-c7a1-4bee-bb82-41c4b0354da1\") " pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.230643 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp77d\" (UniqueName: \"kubernetes.io/projected/51cb4df4-8c6a-4563-bd07-9c05182b4216-kube-api-access-xp77d\") pod \"watcher-operator-controller-manager-6cbc6dd547-gjkcw\" (UID: \"51cb4df4-8c6a-4563-bd07-9c05182b4216\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.240540 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9"] Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.268696 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp77d\" (UniqueName: \"kubernetes.io/projected/51cb4df4-8c6a-4563-bd07-9c05182b4216-kube-api-access-xp77d\") pod \"watcher-operator-controller-manager-6cbc6dd547-gjkcw\" (UID: \"51cb4df4-8c6a-4563-bd07-9c05182b4216\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.272940 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.331522 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mz2f\" (UniqueName: \"kubernetes.io/projected/1091a8d9-172e-4016-b354-16329cdab528-kube-api-access-8mz2f\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb\" (UID: \"1091a8d9-172e-4016-b354-16329cdab528\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.331601 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8nb5\" (UniqueName: \"kubernetes.io/projected/f9f27028-c7a1-4bee-bb82-41c4b0354da1-kube-api-access-r8nb5\") pod \"openstack-operator-controller-manager-7f9f8b87ff-hgpl2\" (UID: \"f9f27028-c7a1-4bee-bb82-41c4b0354da1\") " pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.331658 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9f27028-c7a1-4bee-bb82-41c4b0354da1-cert\") pod \"openstack-operator-controller-manager-7f9f8b87ff-hgpl2\" (UID: \"f9f27028-c7a1-4bee-bb82-41c4b0354da1\") " pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.340214 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9f27028-c7a1-4bee-bb82-41c4b0354da1-cert\") pod \"openstack-operator-controller-manager-7f9f8b87ff-hgpl2\" (UID: \"f9f27028-c7a1-4bee-bb82-41c4b0354da1\") " pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.349017 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-8mz2f\" (UniqueName: \"kubernetes.io/projected/1091a8d9-172e-4016-b354-16329cdab528-kube-api-access-8mz2f\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb\" (UID: \"1091a8d9-172e-4016-b354-16329cdab528\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.350756 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8nb5\" (UniqueName: \"kubernetes.io/projected/f9f27028-c7a1-4bee-bb82-41c4b0354da1-kube-api-access-r8nb5\") pod \"openstack-operator-controller-manager-7f9f8b87ff-hgpl2\" (UID: \"f9f27028-c7a1-4bee-bb82-41c4b0354da1\") " pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.360140 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.533715 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a519b4b-8c97-4154-b87c-2cbd91e4453b-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9\" (UID: \"1a519b4b-8c97-4154-b87c-2cbd91e4453b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" Oct 06 08:36:21 crc kubenswrapper[4755]: E1006 08:36:21.533916 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 08:36:21 crc kubenswrapper[4755]: E1006 08:36:21.533999 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a519b4b-8c97-4154-b87c-2cbd91e4453b-cert podName:1a519b4b-8c97-4154-b87c-2cbd91e4453b nodeName:}" failed. 
No retries permitted until 2025-10-06 08:36:22.533980045 +0000 UTC m=+839.363295249 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a519b4b-8c97-4154-b87c-2cbd91e4453b-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" (UID: "1a519b4b-8c97-4154-b87c-2cbd91e4453b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.533926 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.558464 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9" event={"ID":"5e7b409c-75ff-43bf-87e1-9a7877dd21f3","Type":"ContainerStarted","Data":"2cb93881504613336e6e88696ad94641efade0834ed35ddaa6ce36a48e1d16d4"} Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.564178 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb" Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.780953 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff"] Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.825273 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l"] Oct 06 08:36:21 crc kubenswrapper[4755]: I1006 08:36:21.832416 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p"] Oct 06 08:36:21 crc kubenswrapper[4755]: W1006 08:36:21.870686 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c9d42bf_9896_4198_aeb9_352d080978d0.slice/crio-3d35e8a310152c5fedf66333ead55ec2731caccf949345e6fa03eac7aa94c274 WatchSource:0}: Error finding container 3d35e8a310152c5fedf66333ead55ec2731caccf949345e6fa03eac7aa94c274: Status 404 returned error can't find the container with id 3d35e8a310152c5fedf66333ead55ec2731caccf949345e6fa03eac7aa94c274 Oct 06 08:36:21 crc kubenswrapper[4755]: W1006 08:36:21.871022 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd33c2461_9722_468a_b6e4_20b4ce822f18.slice/crio-e070c0aa8d3c62b7edf71c8e4c8e0d51b76561c93422b40252156e6c05273999 WatchSource:0}: Error finding container e070c0aa8d3c62b7edf71c8e4c8e0d51b76561c93422b40252156e6c05273999: Status 404 returned error can't find the container with id e070c0aa8d3c62b7edf71c8e4c8e0d51b76561c93422b40252156e6c05273999 Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.171234 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw"] Oct 06 08:36:22 crc 
kubenswrapper[4755]: I1006 08:36:22.177129 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.552117 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a519b4b-8c97-4154-b87c-2cbd91e4453b-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9\" (UID: \"1a519b4b-8c97-4154-b87c-2cbd91e4453b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.561389 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a519b4b-8c97-4154-b87c-2cbd91e4453b-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9\" (UID: \"1a519b4b-8c97-4154-b87c-2cbd91e4453b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.567690 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff" event={"ID":"d33c2461-9722-468a-b6e4-20b4ce822f18","Type":"ContainerStarted","Data":"e070c0aa8d3c62b7edf71c8e4c8e0d51b76561c93422b40252156e6c05273999"} Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.571535 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw" event={"ID":"206b28f4-45a9-4352-bb98-717c408dfcac","Type":"ContainerStarted","Data":"6b147ee1e5386ee4ba11ffbf996fb02204620f77eebd4bbd972eb4f01b8f5708"} Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.572969 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p" 
event={"ID":"2c9d42bf-9896-4198-aeb9-352d080978d0","Type":"ContainerStarted","Data":"3d35e8a310152c5fedf66333ead55ec2731caccf949345e6fa03eac7aa94c274"} Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.573895 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg" event={"ID":"169c4b1e-417b-4ddf-9886-6c4668257712","Type":"ContainerStarted","Data":"2b259a0c9c86cb722e7cd5e0abbe6f0b1bf340bb58d3afb170c0b3c340dfc9bb"} Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.574799 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l" event={"ID":"1412ad22-876d-4924-9f9b-468970063426","Type":"ContainerStarted","Data":"b8963a79a95f4b08fece940709ec14fd876056b633d4e80bb46a986e78e28041"} Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.607013 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.800517 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.817696 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.827675 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.843658 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-555f5"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.845696 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.855843 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.873672 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.877717 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.881831 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.895446 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.900755 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.906451 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.925540 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.934460 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.939588 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg"] Oct 06 08:36:22 crc kubenswrapper[4755]: I1006 08:36:22.942878 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2"] Oct 06 08:36:23 crc kubenswrapper[4755]: W1006 08:36:23.517804 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0fe017f_b521_4146_a2fe_7b790d585e22.slice/crio-adf814dfdceb3effc5f869c4efc23d5dbedce8616d67d0ef92b78e0cde90b7da WatchSource:0}: Error finding container adf814dfdceb3effc5f869c4efc23d5dbedce8616d67d0ef92b78e0cde90b7da: Status 404 returned error can't find the container with id adf814dfdceb3effc5f869c4efc23d5dbedce8616d67d0ef92b78e0cde90b7da Oct 06 08:36:23 crc kubenswrapper[4755]: W1006 08:36:23.550294 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f27028_c7a1_4bee_bb82_41c4b0354da1.slice/crio-b3a36e5e385063ea51fcaa94ba3d4746c8240b655bf379ceb789edcf1f0bf715 WatchSource:0}: Error finding container b3a36e5e385063ea51fcaa94ba3d4746c8240b655bf379ceb789edcf1f0bf715: Status 404 returned error can't find the container with id b3a36e5e385063ea51fcaa94ba3d4746c8240b655bf379ceb789edcf1f0bf715 Oct 06 08:36:23 crc kubenswrapper[4755]: I1006 08:36:23.588454 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" event={"ID":"f9f27028-c7a1-4bee-bb82-41c4b0354da1","Type":"ContainerStarted","Data":"b3a36e5e385063ea51fcaa94ba3d4746c8240b655bf379ceb789edcf1f0bf715"} Oct 06 08:36:23 crc kubenswrapper[4755]: I1006 08:36:23.589285 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw" 
event={"ID":"51cb4df4-8c6a-4563-bd07-9c05182b4216","Type":"ContainerStarted","Data":"b934fd8be10b3b5f298469edf6e26893acaa98adc3794efdb11cc11b9e3ff43d"} Oct 06 08:36:23 crc kubenswrapper[4755]: I1006 08:36:23.590735 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb" event={"ID":"c0fe017f-b521-4146-a2fe-7b790d585e22","Type":"ContainerStarted","Data":"adf814dfdceb3effc5f869c4efc23d5dbedce8616d67d0ef92b78e0cde90b7da"} Oct 06 08:36:24 crc kubenswrapper[4755]: W1006 08:36:24.518608 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod771c3503_d156_46de_81b3_fe3845ecd58f.slice/crio-1f142f5a0c5659c66b95f6108bf03b6272db5f8128c42183099feffad05fbd4d WatchSource:0}: Error finding container 1f142f5a0c5659c66b95f6108bf03b6272db5f8128c42183099feffad05fbd4d: Status 404 returned error can't find the container with id 1f142f5a0c5659c66b95f6108bf03b6272db5f8128c42183099feffad05fbd4d Oct 06 08:36:24 crc kubenswrapper[4755]: W1006 08:36:24.576043 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96e87134_c1a1_49fb_9e05_59e10699741f.slice/crio-471edecf4837463096b04672b7517280e4de9aef108da1357b3f1d7bc1caf6bb WatchSource:0}: Error finding container 471edecf4837463096b04672b7517280e4de9aef108da1357b3f1d7bc1caf6bb: Status 404 returned error can't find the container with id 471edecf4837463096b04672b7517280e4de9aef108da1357b3f1d7bc1caf6bb Oct 06 08:36:24 crc kubenswrapper[4755]: W1006 08:36:24.583422 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fbfa495_f4d7_4bf8_a489_f8d24476fbf2.slice/crio-3e68dd6a90eb4da6d2188af586acf2096b2fb7dff1079ded080d27443d7eed03 WatchSource:0}: Error finding container 3e68dd6a90eb4da6d2188af586acf2096b2fb7dff1079ded080d27443d7eed03: Status 
404 returned error can't find the container with id 3e68dd6a90eb4da6d2188af586acf2096b2fb7dff1079ded080d27443d7eed03 Oct 06 08:36:24 crc kubenswrapper[4755]: I1006 08:36:24.606268 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299" event={"ID":"771c3503-d156-46de-81b3-fe3845ecd58f","Type":"ContainerStarted","Data":"1f142f5a0c5659c66b95f6108bf03b6272db5f8128c42183099feffad05fbd4d"} Oct 06 08:36:24 crc kubenswrapper[4755]: I1006 08:36:24.607482 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c" event={"ID":"73a255d0-1e8a-43c9-b27c-e7ff650c3c79","Type":"ContainerStarted","Data":"41ed33e88a36169521f888fded88541eaee5e6a11ea2868e2f040635cdd8c759"} Oct 06 08:36:24 crc kubenswrapper[4755]: I1006 08:36:24.609507 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt" event={"ID":"fa923997-ddf0-4e0a-9ef9-bde22b553dfb","Type":"ContainerStarted","Data":"43375c3c375c996f00da45744c4abe637901afe6385fec5334ddc713ae6b2e29"} Oct 06 08:36:24 crc kubenswrapper[4755]: I1006 08:36:24.610721 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp" event={"ID":"5fbfa495-f4d7-4bf8-a489-f8d24476fbf2","Type":"ContainerStarted","Data":"3e68dd6a90eb4da6d2188af586acf2096b2fb7dff1079ded080d27443d7eed03"} Oct 06 08:36:24 crc kubenswrapper[4755]: I1006 08:36:24.611926 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt" event={"ID":"26edd385-18c0-41cf-8094-e1844f07364a","Type":"ContainerStarted","Data":"c7e0979fd6e74d3bd0ad6f286a3f9e9116ac264d9257bab47bbaa2adb6643ef3"} Oct 06 08:36:24 crc kubenswrapper[4755]: I1006 08:36:24.612870 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6" event={"ID":"57a14562-fc08-4785-a24c-ead1cb0919e6","Type":"ContainerStarted","Data":"159dae81c04407f76b488476d787fa53715a6ee741fc8c8b6b1532d79e51ea75"} Oct 06 08:36:24 crc kubenswrapper[4755]: I1006 08:36:24.630950 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg" event={"ID":"96e87134-c1a1-49fb-9e05-59e10699741f","Type":"ContainerStarted","Data":"471edecf4837463096b04672b7517280e4de9aef108da1357b3f1d7bc1caf6bb"} Oct 06 08:36:25 crc kubenswrapper[4755]: W1006 08:36:25.284957 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda151d352_3084_45b5_80b6_48510de0f087.slice/crio-3356e8f39cec6289a8dd99a655778670ace878dbdae4486e29745d7ba85eeb69 WatchSource:0}: Error finding container 3356e8f39cec6289a8dd99a655778670ace878dbdae4486e29745d7ba85eeb69: Status 404 returned error can't find the container with id 3356e8f39cec6289a8dd99a655778670ace878dbdae4486e29745d7ba85eeb69 Oct 06 08:36:25 crc kubenswrapper[4755]: I1006 08:36:25.644413 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q" event={"ID":"a151d352-3084-45b5-80b6-48510de0f087","Type":"ContainerStarted","Data":"3356e8f39cec6289a8dd99a655778670ace878dbdae4486e29745d7ba85eeb69"} Oct 06 08:36:25 crc kubenswrapper[4755]: I1006 08:36:25.646018 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-555f5" event={"ID":"746ef71e-2879-4789-879d-a4479700346e","Type":"ContainerStarted","Data":"0bb2d603ec423165fe8e512a199f0737de19da2471bed1d09b5dcf4c7fc079b1"} Oct 06 08:36:25 crc kubenswrapper[4755]: I1006 08:36:25.647330 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" event={"ID":"cc16e4a5-7b17-4f64-840e-1d0f6971c7a4","Type":"ContainerStarted","Data":"cd8de7ee791aedfe107aa410d070afe089dfe8e8c9706edb6089c789113cc4ef"} Oct 06 08:36:25 crc kubenswrapper[4755]: I1006 08:36:25.649033 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n" event={"ID":"cc16a1c7-6450-414d-9e4b-518014071887","Type":"ContainerStarted","Data":"3f6a44609f5f54339105cf26fd88ca5f80c3fbf9a8eed42fc8a0ad17cb14672c"} Oct 06 08:36:25 crc kubenswrapper[4755]: I1006 08:36:25.650076 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb" event={"ID":"1091a8d9-172e-4016-b354-16329cdab528","Type":"ContainerStarted","Data":"e6374ed218d4e00b5fe0a19bde89e19592fd79bbdd14a15a8ebb145dd78d95b6"} Oct 06 08:36:25 crc kubenswrapper[4755]: I1006 08:36:25.651206 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658" event={"ID":"188c5ff1-ba40-4c30-b411-d5beb8cdb4e8","Type":"ContainerStarted","Data":"3a7f897e35b7ec6fe9aec886e1a72a233724463895ee3296b376fbc274a89b58"} Oct 06 08:36:27 crc kubenswrapper[4755]: I1006 08:36:27.374505 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9"] Oct 06 08:36:32 crc kubenswrapper[4755]: W1006 08:36:32.011539 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a519b4b_8c97_4154_b87c_2cbd91e4453b.slice/crio-708bdbc0e1eecb90de1a5ad4dba89f953d7c97343e0832e841501d9f34790e05 WatchSource:0}: Error finding container 708bdbc0e1eecb90de1a5ad4dba89f953d7c97343e0832e841501d9f34790e05: Status 404 returned error can't find the container with id 
708bdbc0e1eecb90de1a5ad4dba89f953d7c97343e0832e841501d9f34790e05 Oct 06 08:36:32 crc kubenswrapper[4755]: I1006 08:36:32.709354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" event={"ID":"1a519b4b-8c97-4154-b87c-2cbd91e4453b","Type":"ContainerStarted","Data":"708bdbc0e1eecb90de1a5ad4dba89f953d7c97343e0832e841501d9f34790e05"} Oct 06 08:36:37 crc kubenswrapper[4755]: I1006 08:36:37.744783 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" event={"ID":"f9f27028-c7a1-4bee-bb82-41c4b0354da1","Type":"ContainerStarted","Data":"5e93470fc8247db176baf872ad93bd3da9504fb4c16f6cab16f46f4fc5d3e7a7"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.809841 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw" event={"ID":"206b28f4-45a9-4352-bb98-717c408dfcac","Type":"ContainerStarted","Data":"8439c9fa4fdc915d41c2d0f1e87d81c32d725f40168d02538a37cd9a202179ca"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.812751 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n" event={"ID":"cc16a1c7-6450-414d-9e4b-518014071887","Type":"ContainerStarted","Data":"8cb3dc4f3e0d7faee53841d44c7c55ae1442d2a0f54c5652490c0d797c9132b9"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.814145 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb" event={"ID":"1091a8d9-172e-4016-b354-16329cdab528","Type":"ContainerStarted","Data":"e865d300859b0d0a983ce3d489d8f36cf9a06bdfb0932000e8c4457c97c50c31"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.832180 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb" event={"ID":"c0fe017f-b521-4146-a2fe-7b790d585e22","Type":"ContainerStarted","Data":"e3d65a9ab7df9914c88060bbf25362f509a1bca8103dac21edcaca30a6e39ce2"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.848870 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg" event={"ID":"96e87134-c1a1-49fb-9e05-59e10699741f","Type":"ContainerStarted","Data":"98a874010d30853ee7641104a6177de3ced3c30ee858da3649edd5b102b280ee"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.850707 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg" event={"ID":"169c4b1e-417b-4ddf-9886-6c4668257712","Type":"ContainerStarted","Data":"4f989ae66349490c05517f86ef3f0e08da6ef8c503b85dd3f0811bf3e1141fb7"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.851809 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6" event={"ID":"57a14562-fc08-4785-a24c-ead1cb0919e6","Type":"ContainerStarted","Data":"9c0fc48ec8ab1567f00618bd2b91d510866ade7c4d2a3819e53038a821d344e2"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.881453 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l" event={"ID":"1412ad22-876d-4924-9f9b-468970063426","Type":"ContainerStarted","Data":"f16e1ee737a90066ac57462496f9f87c3eca552479ffe36a9d4b303661aed59c"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.882268 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l" event={"ID":"1412ad22-876d-4924-9f9b-468970063426","Type":"ContainerStarted","Data":"8e9b8eb809c4e7b30d738b550f4c459001467e874eb2973ce741024ce6377c11"} Oct 06 08:36:38 crc 
kubenswrapper[4755]: I1006 08:36:38.883273 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l" Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.919878 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt" event={"ID":"26edd385-18c0-41cf-8094-e1844f07364a","Type":"ContainerStarted","Data":"e69845f908b8a99b78e73d4a7bedb21b2df64dbba6b1ccacbd2f260f7b6babbf"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.924226 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-555f5" event={"ID":"746ef71e-2879-4789-879d-a4479700346e","Type":"ContainerStarted","Data":"0cd18d8f1972052ddc72c43d747c92d17458db2b76a5d170941e30c90df52640"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.938337 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299" event={"ID":"771c3503-d156-46de-81b3-fe3845ecd58f","Type":"ContainerStarted","Data":"3800203e5d4c9d4edbf4ec7639de8eb962330a001d0f1c04f1accbf33361e7a5"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.943641 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" event={"ID":"cc16e4a5-7b17-4f64-840e-1d0f6971c7a4","Type":"ContainerStarted","Data":"37afe68a5a3c13a63d97420678e0186da31f3ed2012e0f82e43d89d4dd9e7f72"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.945155 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.946936 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9" 
event={"ID":"5e7b409c-75ff-43bf-87e1-9a7877dd21f3","Type":"ContainerStarted","Data":"70213c8175ad38163541e0a838facccbcff92cb84c033eb9ef0c2df6474aa6be"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.948587 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt" event={"ID":"fa923997-ddf0-4e0a-9ef9-bde22b553dfb","Type":"ContainerStarted","Data":"32e5d61f8d34b522ad6cf73e922806ccf70702c38a4f539a3fb9df1defeb0e86"} Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.961279 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l" podStartSLOduration=13.209951276 podStartE2EDuration="18.961252957s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:21.795128588 +0000 UTC m=+838.624443802" lastFinishedPulling="2025-10-06 08:36:27.546430269 +0000 UTC m=+844.375745483" observedRunningTime="2025-10-06 08:36:38.956659633 +0000 UTC m=+855.785974857" watchObservedRunningTime="2025-10-06 08:36:38.961252957 +0000 UTC m=+855.790568171" Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.962615 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb" podStartSLOduration=6.888411061 podStartE2EDuration="18.962609884s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:25.297109915 +0000 UTC m=+842.126425129" lastFinishedPulling="2025-10-06 08:36:37.371308738 +0000 UTC m=+854.200623952" observedRunningTime="2025-10-06 08:36:38.859934293 +0000 UTC m=+855.689249517" watchObservedRunningTime="2025-10-06 08:36:38.962609884 +0000 UTC m=+855.791925098" Oct 06 08:36:38 crc kubenswrapper[4755]: I1006 08:36:38.981474 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658" event={"ID":"188c5ff1-ba40-4c30-b411-d5beb8cdb4e8","Type":"ContainerStarted","Data":"933a3d0e0a58fd24ea8aae083835ddde25c6502e2f7c97af381ae9722b6cd464"} Oct 06 08:36:39 crc kubenswrapper[4755]: I1006 08:36:39.021348 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q" event={"ID":"a151d352-3084-45b5-80b6-48510de0f087","Type":"ContainerStarted","Data":"34c5d33b06809b5651533d950476a5a9790a7653f4108dda65dcf268a4656027"} Oct 06 08:36:39 crc kubenswrapper[4755]: I1006 08:36:39.029552 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" podStartSLOduration=6.971869352 podStartE2EDuration="19.029533947s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:25.343887732 +0000 UTC m=+842.173202946" lastFinishedPulling="2025-10-06 08:36:37.401552327 +0000 UTC m=+854.230867541" observedRunningTime="2025-10-06 08:36:39.025074696 +0000 UTC m=+855.854389910" watchObservedRunningTime="2025-10-06 08:36:39.029533947 +0000 UTC m=+855.858849161" Oct 06 08:36:39 crc kubenswrapper[4755]: I1006 08:36:39.052499 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff" event={"ID":"d33c2461-9722-468a-b6e4-20b4ce822f18","Type":"ContainerStarted","Data":"00b1275ce0ec2cbe9126b74378371a0d6cf9b66a9de90f61d1a5ef2a9a47ee19"} Oct 06 08:36:39 crc kubenswrapper[4755]: I1006 08:36:39.054744 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff" Oct 06 08:36:39 crc kubenswrapper[4755]: I1006 08:36:39.097419 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p" 
event={"ID":"2c9d42bf-9896-4198-aeb9-352d080978d0","Type":"ContainerStarted","Data":"c29922f6178a5e98ef6abc9597abceac37a5f1ab9f3bd125bd7a5e43059e0661"} Oct 06 08:36:39 crc kubenswrapper[4755]: I1006 08:36:39.109820 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c" event={"ID":"73a255d0-1e8a-43c9-b27c-e7ff650c3c79","Type":"ContainerStarted","Data":"c3bc29403574618a41a5b30b4dd55ae7233aaa79b11aad672b0fa35efdb13121"} Oct 06 08:36:39 crc kubenswrapper[4755]: I1006 08:36:39.133465 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" event={"ID":"1a519b4b-8c97-4154-b87c-2cbd91e4453b","Type":"ContainerStarted","Data":"d2b56bfbfd9b881738774f3471fec7fb94cd34255cd7e2911a08d015e77089de"} Oct 06 08:36:39 crc kubenswrapper[4755]: I1006 08:36:39.147463 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff" podStartSLOduration=13.481394412 podStartE2EDuration="19.147126422s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:21.88073316 +0000 UTC m=+838.710048374" lastFinishedPulling="2025-10-06 08:36:27.54646517 +0000 UTC m=+844.375780384" observedRunningTime="2025-10-06 08:36:39.124875849 +0000 UTC m=+855.954191063" watchObservedRunningTime="2025-10-06 08:36:39.147126422 +0000 UTC m=+855.976441636" Oct 06 08:36:39 crc kubenswrapper[4755]: I1006 08:36:39.164374 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw" event={"ID":"51cb4df4-8c6a-4563-bd07-9c05182b4216","Type":"ContainerStarted","Data":"81c7f34f9b354f5533a4182adba51c91c5a505e2018b04a9d503c70e5fe10a15"} Oct 06 08:36:39 crc kubenswrapper[4755]: I1006 08:36:39.182219 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp" event={"ID":"5fbfa495-f4d7-4bf8-a489-f8d24476fbf2","Type":"ContainerStarted","Data":"2ec5d42dc2e71d29226271e40b9a17e027b2610a4a9820fcdc235e7a2b373ad5"} Oct 06 08:36:39 crc kubenswrapper[4755]: I1006 08:36:39.203624 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" event={"ID":"f9f27028-c7a1-4bee-bb82-41c4b0354da1","Type":"ContainerStarted","Data":"91b8b37d82b47679ece2809b4069a8ffc57a25f85a74550194d7e3b5a3782ea9"} Oct 06 08:36:39 crc kubenswrapper[4755]: I1006 08:36:39.204031 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" Oct 06 08:36:39 crc kubenswrapper[4755]: I1006 08:36:39.297548 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" podStartSLOduration=19.297531376 podStartE2EDuration="19.297531376s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:36:39.287300029 +0000 UTC m=+856.116615243" watchObservedRunningTime="2025-10-06 08:36:39.297531376 +0000 UTC m=+856.126846590" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.212094 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw" event={"ID":"206b28f4-45a9-4352-bb98-717c408dfcac","Type":"ContainerStarted","Data":"e1c25f05acb25a79963c4980fe6cf9562bc0d689d350e9139bdb7caca7c0397a"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.213639 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw" Oct 06 08:36:40 crc 
kubenswrapper[4755]: I1006 08:36:40.213960 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299" event={"ID":"771c3503-d156-46de-81b3-fe3845ecd58f","Type":"ContainerStarted","Data":"a8ba34dc310706df5a269c7d8cdd0b06b93f2e430be0bb93c743d1d8fcdb04f7"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.216679 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-555f5" event={"ID":"746ef71e-2879-4789-879d-a4479700346e","Type":"ContainerStarted","Data":"845d3ec06f0a04499569042c40d5f796c90a9e4bf12f018dbfcce0da8612e214"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.216779 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-555f5" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.218363 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9" event={"ID":"5e7b409c-75ff-43bf-87e1-9a7877dd21f3","Type":"ContainerStarted","Data":"4f8b32ae3846814f97601bdf9095c6962eb02807939eb1b527e25fd9d2eef0f2"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.218620 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.220167 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw" event={"ID":"51cb4df4-8c6a-4563-bd07-9c05182b4216","Type":"ContainerStarted","Data":"b5185e8c308b04f6345a836f3a0fa9a602fef98717a579544bd1292f10be1256"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.220293 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.221884 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658" event={"ID":"188c5ff1-ba40-4c30-b411-d5beb8cdb4e8","Type":"ContainerStarted","Data":"3a2c713aa805a99fe8575932ef716c1ed20ed28a1c568290ff7f3225f6249ddd"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.222018 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.223633 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg" event={"ID":"96e87134-c1a1-49fb-9e05-59e10699741f","Type":"ContainerStarted","Data":"4f0f795f7382bb654da6a05b4e52f73b235600bda27de9b91fe0d0117305a408"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.224099 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.225556 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p" event={"ID":"2c9d42bf-9896-4198-aeb9-352d080978d0","Type":"ContainerStarted","Data":"6d547b01efcec1f63ab38ec5032d61c69f9a2529da8ee4b97f2e52da3416e105"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.226074 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.227716 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff" 
event={"ID":"d33c2461-9722-468a-b6e4-20b4ce822f18","Type":"ContainerStarted","Data":"6bad6e953ec8544241bca491f6fa82299ff509d6c624863cd18c589ce8442029"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.239852 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" event={"ID":"cc16e4a5-7b17-4f64-840e-1d0f6971c7a4","Type":"ContainerStarted","Data":"fef9b7ca1b64c9cc886e55091007953432d3f761682e7096f5b729718d138c71"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.241995 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb" event={"ID":"c0fe017f-b521-4146-a2fe-7b790d585e22","Type":"ContainerStarted","Data":"447c448db7b37f4a07d2a4f9226d84330da0df8b3df0f79cddf901b1099e3082"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.242116 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.243610 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt" event={"ID":"fa923997-ddf0-4e0a-9ef9-bde22b553dfb","Type":"ContainerStarted","Data":"3803ae45054c8057e36f1af5edbce474c8ef8d393c49d83349ff7711c54c9e83"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.244357 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.245315 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q" event={"ID":"a151d352-3084-45b5-80b6-48510de0f087","Type":"ContainerStarted","Data":"d47a5c348c84a91209df534a239c8467b2d37e600ed8e587bcdb21f9718e24e7"} Oct 06 08:36:40 crc 
kubenswrapper[4755]: I1006 08:36:40.245530 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.250422 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6" event={"ID":"57a14562-fc08-4785-a24c-ead1cb0919e6","Type":"ContainerStarted","Data":"23ff394dfe4b42b6d77d2c6e681c3e1dcbb197ad6040f0d23c4f52067002b417"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.250823 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.253177 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg" event={"ID":"169c4b1e-417b-4ddf-9886-6c4668257712","Type":"ContainerStarted","Data":"8ce950895cae3707a30120936f0083fcbb86e4fd84846d52f6c50e818f0f7926"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.253382 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9" podStartSLOduration=9.110985487 podStartE2EDuration="20.253370908s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:21.329777207 +0000 UTC m=+838.159092421" lastFinishedPulling="2025-10-06 08:36:32.472162618 +0000 UTC m=+849.301477842" observedRunningTime="2025-10-06 08:36:40.252588977 +0000 UTC m=+857.081904201" watchObservedRunningTime="2025-10-06 08:36:40.253370908 +0000 UTC m=+857.082686122" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.253629 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg" Oct 06 08:36:40 crc 
kubenswrapper[4755]: I1006 08:36:40.254873 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw" podStartSLOduration=5.207863438 podStartE2EDuration="20.254867559s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:22.180373079 +0000 UTC m=+839.009688293" lastFinishedPulling="2025-10-06 08:36:37.2273772 +0000 UTC m=+854.056692414" observedRunningTime="2025-10-06 08:36:40.23722063 +0000 UTC m=+857.066535854" watchObservedRunningTime="2025-10-06 08:36:40.254867559 +0000 UTC m=+857.084182773" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.256645 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c" event={"ID":"73a255d0-1e8a-43c9-b27c-e7ff650c3c79","Type":"ContainerStarted","Data":"f1f1e694066e9142ecbdf55e4154ef4a1e9ec6f6e9ece2ecf4a2356b343743bd"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.257249 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.258689 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n" event={"ID":"cc16a1c7-6450-414d-9e4b-518014071887","Type":"ContainerStarted","Data":"737288f4b313b2d139bd277ff3738890c733eced885930f1352cd5a58763a422"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.262997 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" event={"ID":"1a519b4b-8c97-4154-b87c-2cbd91e4453b","Type":"ContainerStarted","Data":"b1b1d42bb4ffd053ae831f1cd2a60156d5bb7463ab58010478e8eb35c54e77e2"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.263155 4755 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.265223 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp" event={"ID":"5fbfa495-f4d7-4bf8-a489-f8d24476fbf2","Type":"ContainerStarted","Data":"abe60f8271f8ca6264646c5cda7f841368f2312d3ba20106f4c6dd86b7f9a80a"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.265707 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.271679 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt" event={"ID":"26edd385-18c0-41cf-8094-e1844f07364a","Type":"ContainerStarted","Data":"b50d5ba69c99949e30a1dda6edbb4ad7c9538c54681ddb7b334dc19202c056e1"} Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.271711 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.309420 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-555f5" podStartSLOduration=8.299489176 podStartE2EDuration="20.309398896s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:25.30389745 +0000 UTC m=+842.133212654" lastFinishedPulling="2025-10-06 08:36:37.31380716 +0000 UTC m=+854.143122374" observedRunningTime="2025-10-06 08:36:40.30294043 +0000 UTC m=+857.132255654" watchObservedRunningTime="2025-10-06 08:36:40.309398896 +0000 UTC m=+857.138714110" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.309533 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg" podStartSLOduration=7.666869246 podStartE2EDuration="20.309529209s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:24.584266374 +0000 UTC m=+841.413581588" lastFinishedPulling="2025-10-06 08:36:37.226926327 +0000 UTC m=+854.056241551" observedRunningTime="2025-10-06 08:36:40.275222589 +0000 UTC m=+857.104537813" watchObservedRunningTime="2025-10-06 08:36:40.309529209 +0000 UTC m=+857.138844423" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.329361 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658" podStartSLOduration=7.702190802 podStartE2EDuration="20.329342126s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:24.599370262 +0000 UTC m=+841.428685476" lastFinishedPulling="2025-10-06 08:36:37.226521586 +0000 UTC m=+854.055836800" observedRunningTime="2025-10-06 08:36:40.324963997 +0000 UTC m=+857.154279211" watchObservedRunningTime="2025-10-06 08:36:40.329342126 +0000 UTC m=+857.158657340" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.362872 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p" podStartSLOduration=7.146007157 podStartE2EDuration="20.362852233s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:21.876586659 +0000 UTC m=+838.705901873" lastFinishedPulling="2025-10-06 08:36:35.093431725 +0000 UTC m=+851.922746949" observedRunningTime="2025-10-06 08:36:40.358990299 +0000 UTC m=+857.188305513" watchObservedRunningTime="2025-10-06 08:36:40.362852233 +0000 UTC m=+857.192167447" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.386757 4755 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw" podStartSLOduration=6.711191154 podStartE2EDuration="20.38673672s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:23.550410915 +0000 UTC m=+840.379726139" lastFinishedPulling="2025-10-06 08:36:37.225956491 +0000 UTC m=+854.055271705" observedRunningTime="2025-10-06 08:36:40.384808689 +0000 UTC m=+857.214123903" watchObservedRunningTime="2025-10-06 08:36:40.38673672 +0000 UTC m=+857.216051934" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.406176 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299" podStartSLOduration=7.701966785 podStartE2EDuration="20.406156257s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:24.523416494 +0000 UTC m=+841.352731708" lastFinishedPulling="2025-10-06 08:36:37.227605966 +0000 UTC m=+854.056921180" observedRunningTime="2025-10-06 08:36:40.39963327 +0000 UTC m=+857.228948494" watchObservedRunningTime="2025-10-06 08:36:40.406156257 +0000 UTC m=+857.235471471" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.430681 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6" podStartSLOduration=7.73031668 podStartE2EDuration="20.43065831s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:24.527718828 +0000 UTC m=+841.357034042" lastFinishedPulling="2025-10-06 08:36:37.228060458 +0000 UTC m=+854.057375672" observedRunningTime="2025-10-06 08:36:40.425624154 +0000 UTC m=+857.254939368" watchObservedRunningTime="2025-10-06 08:36:40.43065831 +0000 UTC m=+857.259973524" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.450112 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c" podStartSLOduration=7.7959559259999995 podStartE2EDuration="20.450095406s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:24.571798221 +0000 UTC m=+841.401113435" lastFinishedPulling="2025-10-06 08:36:37.225937701 +0000 UTC m=+854.055252915" observedRunningTime="2025-10-06 08:36:40.446672814 +0000 UTC m=+857.275988038" watchObservedRunningTime="2025-10-06 08:36:40.450095406 +0000 UTC m=+857.279410620" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.479966 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n" podStartSLOduration=8.359556776 podStartE2EDuration="20.479945576s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:25.302526916 +0000 UTC m=+842.131842130" lastFinishedPulling="2025-10-06 08:36:37.422915716 +0000 UTC m=+854.252230930" observedRunningTime="2025-10-06 08:36:40.473540252 +0000 UTC m=+857.302855476" watchObservedRunningTime="2025-10-06 08:36:40.479945576 +0000 UTC m=+857.309260790" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.535921 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt" podStartSLOduration=7.879603043 podStartE2EDuration="20.535893941s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:24.571796821 +0000 UTC m=+841.401112035" lastFinishedPulling="2025-10-06 08:36:37.228087719 +0000 UTC m=+854.057402933" observedRunningTime="2025-10-06 08:36:40.508676543 +0000 UTC m=+857.337991777" watchObservedRunningTime="2025-10-06 08:36:40.535893941 +0000 UTC m=+857.365209155" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.538257 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg" podStartSLOduration=7.626253525 podStartE2EDuration="20.538251144s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:22.181428085 +0000 UTC m=+839.010743299" lastFinishedPulling="2025-10-06 08:36:35.093425704 +0000 UTC m=+851.922740918" observedRunningTime="2025-10-06 08:36:40.532787436 +0000 UTC m=+857.362102650" watchObservedRunningTime="2025-10-06 08:36:40.538251144 +0000 UTC m=+857.367566348" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.562867 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" podStartSLOduration=15.205655423 podStartE2EDuration="20.562847201s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:32.029821006 +0000 UTC m=+848.859136220" lastFinishedPulling="2025-10-06 08:36:37.387012784 +0000 UTC m=+854.216327998" observedRunningTime="2025-10-06 08:36:40.560803815 +0000 UTC m=+857.390119039" watchObservedRunningTime="2025-10-06 08:36:40.562847201 +0000 UTC m=+857.392162415" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.593199 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt" podStartSLOduration=7.954484518 podStartE2EDuration="20.593172703s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:24.588217581 +0000 UTC m=+841.417532795" lastFinishedPulling="2025-10-06 08:36:37.226905766 +0000 UTC m=+854.056220980" observedRunningTime="2025-10-06 08:36:40.588736802 +0000 UTC m=+857.418052026" watchObservedRunningTime="2025-10-06 08:36:40.593172703 +0000 UTC m=+857.422487917" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.618874 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q" podStartSLOduration=8.695108522 podStartE2EDuration="20.618847488s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:25.303690235 +0000 UTC m=+842.133005449" lastFinishedPulling="2025-10-06 08:36:37.227429161 +0000 UTC m=+854.056744415" observedRunningTime="2025-10-06 08:36:40.615337963 +0000 UTC m=+857.444653177" watchObservedRunningTime="2025-10-06 08:36:40.618847488 +0000 UTC m=+857.448162702" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.638119 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp" podStartSLOduration=7.839533364 podStartE2EDuration="20.63809755s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:24.58779107 +0000 UTC m=+841.417106284" lastFinishedPulling="2025-10-06 08:36:37.386355256 +0000 UTC m=+854.215670470" observedRunningTime="2025-10-06 08:36:40.636724213 +0000 UTC m=+857.466039437" watchObservedRunningTime="2025-10-06 08:36:40.63809755 +0000 UTC m=+857.467412764" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.659717 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb" podStartSLOduration=6.955514648 podStartE2EDuration="20.659693594s" podCreationTimestamp="2025-10-06 08:36:20 +0000 UTC" firstStartedPulling="2025-10-06 08:36:23.523210904 +0000 UTC m=+840.352526118" lastFinishedPulling="2025-10-06 08:36:37.22738985 +0000 UTC m=+854.056705064" observedRunningTime="2025-10-06 08:36:40.656042655 +0000 UTC m=+857.485357869" watchObservedRunningTime="2025-10-06 08:36:40.659693594 +0000 UTC m=+857.489008808" Oct 06 08:36:40 crc kubenswrapper[4755]: I1006 08:36:40.810529 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n" Oct 06 08:36:41 crc kubenswrapper[4755]: I1006 08:36:41.018273 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299" Oct 06 08:36:50 crc kubenswrapper[4755]: I1006 08:36:50.567682 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-78fdc95566-rdwj9" Oct 06 08:36:50 crc kubenswrapper[4755]: I1006 08:36:50.625629 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-zs28l" Oct 06 08:36:50 crc kubenswrapper[4755]: I1006 08:36:50.628942 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-w6rtg" Oct 06 08:36:50 crc kubenswrapper[4755]: I1006 08:36:50.647971 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-jnt7p" Oct 06 08:36:50 crc kubenswrapper[4755]: I1006 08:36:50.670169 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-9qkff" Oct 06 08:36:50 crc kubenswrapper[4755]: I1006 08:36:50.694283 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-ld9kw" Oct 06 08:36:50 crc kubenswrapper[4755]: I1006 08:36:50.718091 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6nqcm" Oct 06 08:36:50 crc kubenswrapper[4755]: I1006 08:36:50.751675 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-6kmrp" Oct 06 08:36:50 crc kubenswrapper[4755]: I1006 08:36:50.792042 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-zkdrt" Oct 06 08:36:50 crc kubenswrapper[4755]: I1006 08:36:50.813876 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-lbg2n" Oct 06 08:36:50 crc kubenswrapper[4755]: I1006 08:36:50.982213 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q" Oct 06 08:36:51 crc kubenswrapper[4755]: I1006 08:36:51.007522 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-6cfbt" Oct 06 08:36:51 crc kubenswrapper[4755]: I1006 08:36:51.020021 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-nk299" Oct 06 08:36:51 crc kubenswrapper[4755]: I1006 08:36:51.046301 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-k5658" Oct 06 08:36:51 crc kubenswrapper[4755]: I1006 08:36:51.076846 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-gpblg" Oct 06 08:36:51 crc kubenswrapper[4755]: I1006 08:36:51.138074 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-555f5" Oct 06 08:36:51 crc kubenswrapper[4755]: I1006 08:36:51.167283 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-n6gl6" Oct 06 08:36:51 crc kubenswrapper[4755]: I1006 08:36:51.181361 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dbmlb" Oct 06 08:36:51 crc kubenswrapper[4755]: I1006 08:36:51.277075 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-8sf8c" Oct 06 08:36:51 crc kubenswrapper[4755]: I1006 08:36:51.363235 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gjkcw" Oct 06 08:36:51 crc kubenswrapper[4755]: I1006 08:36:51.539837 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7f9f8b87ff-hgpl2" Oct 06 08:36:52 crc kubenswrapper[4755]: I1006 08:36:52.614965 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.393218 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5m7v2"] Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.397657 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5m7v2" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.404358 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.404782 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.404946 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.409835 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-wj9j5" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.411018 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5m7v2"] Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.471889 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lrks\" (UniqueName: \"kubernetes.io/projected/2915d797-6ffc-4282-b5a9-85aa4ef0e378-kube-api-access-6lrks\") pod \"dnsmasq-dns-675f4bcbfc-5m7v2\" (UID: \"2915d797-6ffc-4282-b5a9-85aa4ef0e378\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5m7v2" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.471996 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2915d797-6ffc-4282-b5a9-85aa4ef0e378-config\") pod \"dnsmasq-dns-675f4bcbfc-5m7v2\" (UID: \"2915d797-6ffc-4282-b5a9-85aa4ef0e378\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5m7v2" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.484668 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zjfvd"] Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.486977 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.489325 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.492260 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zjfvd"] Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.573375 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnzrk\" (UniqueName: \"kubernetes.io/projected/b1d07c38-bd17-4b54-95ab-d13d53524497-kube-api-access-nnzrk\") pod \"dnsmasq-dns-78dd6ddcc-zjfvd\" (UID: \"b1d07c38-bd17-4b54-95ab-d13d53524497\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.573438 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lrks\" (UniqueName: \"kubernetes.io/projected/2915d797-6ffc-4282-b5a9-85aa4ef0e378-kube-api-access-6lrks\") pod \"dnsmasq-dns-675f4bcbfc-5m7v2\" (UID: \"2915d797-6ffc-4282-b5a9-85aa4ef0e378\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5m7v2" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.573470 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d07c38-bd17-4b54-95ab-d13d53524497-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zjfvd\" (UID: \"b1d07c38-bd17-4b54-95ab-d13d53524497\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.573521 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d07c38-bd17-4b54-95ab-d13d53524497-config\") pod \"dnsmasq-dns-78dd6ddcc-zjfvd\" (UID: \"b1d07c38-bd17-4b54-95ab-d13d53524497\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" Oct 06 
08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.573543 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2915d797-6ffc-4282-b5a9-85aa4ef0e378-config\") pod \"dnsmasq-dns-675f4bcbfc-5m7v2\" (UID: \"2915d797-6ffc-4282-b5a9-85aa4ef0e378\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5m7v2" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.576629 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2915d797-6ffc-4282-b5a9-85aa4ef0e378-config\") pod \"dnsmasq-dns-675f4bcbfc-5m7v2\" (UID: \"2915d797-6ffc-4282-b5a9-85aa4ef0e378\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5m7v2" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.595355 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lrks\" (UniqueName: \"kubernetes.io/projected/2915d797-6ffc-4282-b5a9-85aa4ef0e378-kube-api-access-6lrks\") pod \"dnsmasq-dns-675f4bcbfc-5m7v2\" (UID: \"2915d797-6ffc-4282-b5a9-85aa4ef0e378\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5m7v2" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.674354 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d07c38-bd17-4b54-95ab-d13d53524497-config\") pod \"dnsmasq-dns-78dd6ddcc-zjfvd\" (UID: \"b1d07c38-bd17-4b54-95ab-d13d53524497\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.674448 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnzrk\" (UniqueName: \"kubernetes.io/projected/b1d07c38-bd17-4b54-95ab-d13d53524497-kube-api-access-nnzrk\") pod \"dnsmasq-dns-78dd6ddcc-zjfvd\" (UID: \"b1d07c38-bd17-4b54-95ab-d13d53524497\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.674480 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d07c38-bd17-4b54-95ab-d13d53524497-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zjfvd\" (UID: \"b1d07c38-bd17-4b54-95ab-d13d53524497\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.676055 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d07c38-bd17-4b54-95ab-d13d53524497-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zjfvd\" (UID: \"b1d07c38-bd17-4b54-95ab-d13d53524497\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.676416 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d07c38-bd17-4b54-95ab-d13d53524497-config\") pod \"dnsmasq-dns-78dd6ddcc-zjfvd\" (UID: \"b1d07c38-bd17-4b54-95ab-d13d53524497\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.694354 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnzrk\" (UniqueName: \"kubernetes.io/projected/b1d07c38-bd17-4b54-95ab-d13d53524497-kube-api-access-nnzrk\") pod \"dnsmasq-dns-78dd6ddcc-zjfvd\" (UID: \"b1d07c38-bd17-4b54-95ab-d13d53524497\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.738551 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5m7v2" Oct 06 08:37:09 crc kubenswrapper[4755]: I1006 08:37:09.808086 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" Oct 06 08:37:10 crc kubenswrapper[4755]: I1006 08:37:10.152066 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5m7v2"] Oct 06 08:37:10 crc kubenswrapper[4755]: W1006 08:37:10.156582 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2915d797_6ffc_4282_b5a9_85aa4ef0e378.slice/crio-c9b9237e25af14e58fa36742888511806960e258d8bd564e4d6dd10bb26fb2e4 WatchSource:0}: Error finding container c9b9237e25af14e58fa36742888511806960e258d8bd564e4d6dd10bb26fb2e4: Status 404 returned error can't find the container with id c9b9237e25af14e58fa36742888511806960e258d8bd564e4d6dd10bb26fb2e4 Oct 06 08:37:10 crc kubenswrapper[4755]: I1006 08:37:10.159305 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 08:37:10 crc kubenswrapper[4755]: I1006 08:37:10.239655 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zjfvd"] Oct 06 08:37:10 crc kubenswrapper[4755]: W1006 08:37:10.239872 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1d07c38_bd17_4b54_95ab_d13d53524497.slice/crio-f3c0bc13ed8d37786f3080a95eede6874d719be0240155bd67e027cafa770aad WatchSource:0}: Error finding container f3c0bc13ed8d37786f3080a95eede6874d719be0240155bd67e027cafa770aad: Status 404 returned error can't find the container with id f3c0bc13ed8d37786f3080a95eede6874d719be0240155bd67e027cafa770aad Oct 06 08:37:10 crc kubenswrapper[4755]: I1006 08:37:10.529062 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" event={"ID":"b1d07c38-bd17-4b54-95ab-d13d53524497","Type":"ContainerStarted","Data":"f3c0bc13ed8d37786f3080a95eede6874d719be0240155bd67e027cafa770aad"} Oct 06 08:37:10 crc kubenswrapper[4755]: 
I1006 08:37:10.530841 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-5m7v2" event={"ID":"2915d797-6ffc-4282-b5a9-85aa4ef0e378","Type":"ContainerStarted","Data":"c9b9237e25af14e58fa36742888511806960e258d8bd564e4d6dd10bb26fb2e4"} Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.408863 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5m7v2"] Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.428618 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pwl7d"] Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.429887 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.444644 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pwl7d"] Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.521153 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tbzt\" (UniqueName: \"kubernetes.io/projected/733a7b61-b175-4381-ade8-91bd0714c2fa-kube-api-access-6tbzt\") pod \"dnsmasq-dns-666b6646f7-pwl7d\" (UID: \"733a7b61-b175-4381-ade8-91bd0714c2fa\") " pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.521352 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/733a7b61-b175-4381-ade8-91bd0714c2fa-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pwl7d\" (UID: \"733a7b61-b175-4381-ade8-91bd0714c2fa\") " pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.521422 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/733a7b61-b175-4381-ade8-91bd0714c2fa-config\") pod \"dnsmasq-dns-666b6646f7-pwl7d\" (UID: \"733a7b61-b175-4381-ade8-91bd0714c2fa\") " pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.622704 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tbzt\" (UniqueName: \"kubernetes.io/projected/733a7b61-b175-4381-ade8-91bd0714c2fa-kube-api-access-6tbzt\") pod \"dnsmasq-dns-666b6646f7-pwl7d\" (UID: \"733a7b61-b175-4381-ade8-91bd0714c2fa\") " pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.622789 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/733a7b61-b175-4381-ade8-91bd0714c2fa-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pwl7d\" (UID: \"733a7b61-b175-4381-ade8-91bd0714c2fa\") " pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.622828 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733a7b61-b175-4381-ade8-91bd0714c2fa-config\") pod \"dnsmasq-dns-666b6646f7-pwl7d\" (UID: \"733a7b61-b175-4381-ade8-91bd0714c2fa\") " pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.623859 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733a7b61-b175-4381-ade8-91bd0714c2fa-config\") pod \"dnsmasq-dns-666b6646f7-pwl7d\" (UID: \"733a7b61-b175-4381-ade8-91bd0714c2fa\") " pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.624403 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/733a7b61-b175-4381-ade8-91bd0714c2fa-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pwl7d\" (UID: 
\"733a7b61-b175-4381-ade8-91bd0714c2fa\") " pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.651360 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tbzt\" (UniqueName: \"kubernetes.io/projected/733a7b61-b175-4381-ade8-91bd0714c2fa-kube-api-access-6tbzt\") pod \"dnsmasq-dns-666b6646f7-pwl7d\" (UID: \"733a7b61-b175-4381-ade8-91bd0714c2fa\") " pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.706128 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zjfvd"] Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.726846 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s2p7n"] Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.729873 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.740816 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s2p7n"] Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.757806 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.826196 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr6rk\" (UniqueName: \"kubernetes.io/projected/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-kube-api-access-wr6rk\") pod \"dnsmasq-dns-57d769cc4f-s2p7n\" (UID: \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\") " pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.826289 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s2p7n\" (UID: \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\") " pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.826335 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-config\") pod \"dnsmasq-dns-57d769cc4f-s2p7n\" (UID: \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\") " pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.927953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr6rk\" (UniqueName: \"kubernetes.io/projected/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-kube-api-access-wr6rk\") pod \"dnsmasq-dns-57d769cc4f-s2p7n\" (UID: \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\") " pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.928043 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s2p7n\" (UID: 
\"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\") " pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.928077 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-config\") pod \"dnsmasq-dns-57d769cc4f-s2p7n\" (UID: \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\") " pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.930367 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s2p7n\" (UID: \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\") " pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.931056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-config\") pod \"dnsmasq-dns-57d769cc4f-s2p7n\" (UID: \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\") " pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:12 crc kubenswrapper[4755]: I1006 08:37:12.951735 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr6rk\" (UniqueName: \"kubernetes.io/projected/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-kube-api-access-wr6rk\") pod \"dnsmasq-dns-57d769cc4f-s2p7n\" (UID: \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\") " pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.062052 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.569025 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.575627 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.580417 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.580612 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.580832 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.581022 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.581147 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.581265 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-47jjq" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.581587 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.584058 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.638366 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.638417 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.638459 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-config-data\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.638511 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.638544 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.638639 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " 
pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.638673 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.638701 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.638760 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.638793 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6t7j\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-kube-api-access-f6t7j\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.638820 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc 
kubenswrapper[4755]: I1006 08:37:13.740314 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.740383 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.740435 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.740485 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.740507 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.740550 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.740606 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6t7j\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-kube-api-access-f6t7j\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.740707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.740773 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.740789 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.742186 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") device mount 
path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.745789 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.745852 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-config-data\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.746139 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.747685 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.748600 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.748828 4755 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-config-data\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.749801 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.749796 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.751549 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.763468 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6t7j\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-kube-api-access-f6t7j\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.769020 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " 
pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.788149 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.848499 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.853202 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.857456 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.857790 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.857985 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.858101 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.858191 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.858281 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.859783 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4wqc7" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.866196 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.936069 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.951022 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.951075 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-786md\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-kube-api-access-786md\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.951101 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.951147 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.951181 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.951229 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d5d33a7-9480-466b-abb7-e8fc7cf08776-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.951276 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.951308 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.951341 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.951362 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:13 crc kubenswrapper[4755]: I1006 08:37:13.951389 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d5d33a7-9480-466b-abb7-e8fc7cf08776-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.052712 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.052761 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d5d33a7-9480-466b-abb7-e8fc7cf08776-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.052815 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.053510 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-786md\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-kube-api-access-786md\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.053576 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.053606 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.053650 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.053723 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d5d33a7-9480-466b-abb7-e8fc7cf08776-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.053804 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.053840 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.053889 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.054047 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.054364 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.054425 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.054521 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.054666 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.055375 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.059303 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d5d33a7-9480-466b-abb7-e8fc7cf08776-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.059493 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d5d33a7-9480-466b-abb7-e8fc7cf08776-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.060007 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.065403 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.069068 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-786md\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-kube-api-access-786md\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.080420 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:14 crc kubenswrapper[4755]: I1006 08:37:14.182043 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.402096 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.404721 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.407904 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.411846 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.412681 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.412739 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bws9g" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.413034 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.419005 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.425450 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.475969 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b13f13fe-a34f-4566-b0bd-31b326722b01-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.476034 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppm4n\" (UniqueName: \"kubernetes.io/projected/b13f13fe-a34f-4566-b0bd-31b326722b01-kube-api-access-ppm4n\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " 
pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.476136 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b13f13fe-a34f-4566-b0bd-31b326722b01-secrets\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.476163 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b13f13fe-a34f-4566-b0bd-31b326722b01-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.476188 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b13f13fe-a34f-4566-b0bd-31b326722b01-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.476212 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b13f13fe-a34f-4566-b0bd-31b326722b01-config-data-default\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.476244 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b13f13fe-a34f-4566-b0bd-31b326722b01-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc 
kubenswrapper[4755]: I1006 08:37:15.476304 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b13f13fe-a34f-4566-b0bd-31b326722b01-kolla-config\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.476337 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.580338 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.580405 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b13f13fe-a34f-4566-b0bd-31b326722b01-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.580438 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppm4n\" (UniqueName: \"kubernetes.io/projected/b13f13fe-a34f-4566-b0bd-31b326722b01-kube-api-access-ppm4n\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.580484 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" 
(UniqueName: \"kubernetes.io/secret/b13f13fe-a34f-4566-b0bd-31b326722b01-secrets\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.580505 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b13f13fe-a34f-4566-b0bd-31b326722b01-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.580523 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b13f13fe-a34f-4566-b0bd-31b326722b01-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.580538 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b13f13fe-a34f-4566-b0bd-31b326722b01-config-data-default\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.580556 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b13f13fe-a34f-4566-b0bd-31b326722b01-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.580611 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b13f13fe-a34f-4566-b0bd-31b326722b01-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.581671 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b13f13fe-a34f-4566-b0bd-31b326722b01-kolla-config\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.581828 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.583908 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b13f13fe-a34f-4566-b0bd-31b326722b01-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.584333 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b13f13fe-a34f-4566-b0bd-31b326722b01-config-data-default\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.584676 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b13f13fe-a34f-4566-b0bd-31b326722b01-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.590322 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b13f13fe-a34f-4566-b0bd-31b326722b01-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.590629 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b13f13fe-a34f-4566-b0bd-31b326722b01-secrets\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.595684 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b13f13fe-a34f-4566-b0bd-31b326722b01-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.610140 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppm4n\" (UniqueName: \"kubernetes.io/projected/b13f13fe-a34f-4566-b0bd-31b326722b01-kube-api-access-ppm4n\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.612254 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b13f13fe-a34f-4566-b0bd-31b326722b01\") " pod="openstack/openstack-galera-0" Oct 06 08:37:15 crc kubenswrapper[4755]: I1006 08:37:15.728904 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.375856 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.381266 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.383916 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-224s9" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.384070 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.384220 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.384256 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.395553 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.492668 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ea480ba-e1ea-47db-b647-39833517fcad-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.492737 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ea480ba-e1ea-47db-b647-39833517fcad-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.492848 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptwmc\" (UniqueName: \"kubernetes.io/projected/0ea480ba-e1ea-47db-b647-39833517fcad-kube-api-access-ptwmc\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.492915 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0ea480ba-e1ea-47db-b647-39833517fcad-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.492966 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0ea480ba-e1ea-47db-b647-39833517fcad-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.492992 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ea480ba-e1ea-47db-b647-39833517fcad-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.493014 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0ea480ba-e1ea-47db-b647-39833517fcad-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.493064 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.493122 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea480ba-e1ea-47db-b647-39833517fcad-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.594676 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0ea480ba-e1ea-47db-b647-39833517fcad-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.594723 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ea480ba-e1ea-47db-b647-39833517fcad-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.594749 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " 
pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.594778 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea480ba-e1ea-47db-b647-39833517fcad-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.594818 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ea480ba-e1ea-47db-b647-39833517fcad-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.594839 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ea480ba-e1ea-47db-b647-39833517fcad-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.594879 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptwmc\" (UniqueName: \"kubernetes.io/projected/0ea480ba-e1ea-47db-b647-39833517fcad-kube-api-access-ptwmc\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.594914 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0ea480ba-e1ea-47db-b647-39833517fcad-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc 
kubenswrapper[4755]: I1006 08:37:16.594931 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0ea480ba-e1ea-47db-b647-39833517fcad-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.596747 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0ea480ba-e1ea-47db-b647-39833517fcad-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.597453 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ea480ba-e1ea-47db-b647-39833517fcad-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.597698 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.598861 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea480ba-e1ea-47db-b647-39833517fcad-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.599720 4755 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0ea480ba-e1ea-47db-b647-39833517fcad-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.601957 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ea480ba-e1ea-47db-b647-39833517fcad-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.606530 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ea480ba-e1ea-47db-b647-39833517fcad-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.612589 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0ea480ba-e1ea-47db-b647-39833517fcad-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.622030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptwmc\" (UniqueName: \"kubernetes.io/projected/0ea480ba-e1ea-47db-b647-39833517fcad-kube-api-access-ptwmc\") pod \"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.633166 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"0ea480ba-e1ea-47db-b647-39833517fcad\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.705403 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.777183 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.778587 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.780375 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.780627 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-bx8f8" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.781254 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.794656 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.899219 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70279926-92db-4788-b714-f14f60f4c55d-config-data\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.899539 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/70279926-92db-4788-b714-f14f60f4c55d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " 
pod="openstack/memcached-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.899599 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxh6v\" (UniqueName: \"kubernetes.io/projected/70279926-92db-4788-b714-f14f60f4c55d-kube-api-access-gxh6v\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.899617 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/70279926-92db-4788-b714-f14f60f4c55d-kolla-config\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 06 08:37:16 crc kubenswrapper[4755]: I1006 08:37:16.899653 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70279926-92db-4788-b714-f14f60f4c55d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 06 08:37:17 crc kubenswrapper[4755]: I1006 08:37:17.001059 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70279926-92db-4788-b714-f14f60f4c55d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 06 08:37:17 crc kubenswrapper[4755]: I1006 08:37:17.001147 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70279926-92db-4788-b714-f14f60f4c55d-config-data\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 06 08:37:17 crc kubenswrapper[4755]: I1006 08:37:17.001175 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/70279926-92db-4788-b714-f14f60f4c55d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 06 08:37:17 crc kubenswrapper[4755]: I1006 08:37:17.001235 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxh6v\" (UniqueName: \"kubernetes.io/projected/70279926-92db-4788-b714-f14f60f4c55d-kube-api-access-gxh6v\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 06 08:37:17 crc kubenswrapper[4755]: I1006 08:37:17.001257 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/70279926-92db-4788-b714-f14f60f4c55d-kolla-config\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 06 08:37:17 crc kubenswrapper[4755]: I1006 08:37:17.002020 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/70279926-92db-4788-b714-f14f60f4c55d-kolla-config\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 06 08:37:17 crc kubenswrapper[4755]: I1006 08:37:17.002636 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70279926-92db-4788-b714-f14f60f4c55d-config-data\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 06 08:37:17 crc kubenswrapper[4755]: I1006 08:37:17.004487 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/70279926-92db-4788-b714-f14f60f4c55d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 
06 08:37:17 crc kubenswrapper[4755]: I1006 08:37:17.015280 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70279926-92db-4788-b714-f14f60f4c55d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 06 08:37:17 crc kubenswrapper[4755]: I1006 08:37:17.021030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxh6v\" (UniqueName: \"kubernetes.io/projected/70279926-92db-4788-b714-f14f60f4c55d-kube-api-access-gxh6v\") pod \"memcached-0\" (UID: \"70279926-92db-4788-b714-f14f60f4c55d\") " pod="openstack/memcached-0" Oct 06 08:37:17 crc kubenswrapper[4755]: I1006 08:37:17.095751 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 06 08:37:18 crc kubenswrapper[4755]: I1006 08:37:18.585145 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:37:18 crc kubenswrapper[4755]: I1006 08:37:18.586484 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:37:18 crc kubenswrapper[4755]: I1006 08:37:18.589667 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-n82zm" Oct 06 08:37:18 crc kubenswrapper[4755]: I1006 08:37:18.599835 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:37:18 crc kubenswrapper[4755]: I1006 08:37:18.725776 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v8vb\" (UniqueName: \"kubernetes.io/projected/1fd020a3-7f41-424d-acd4-0e06764fafb3-kube-api-access-7v8vb\") pod \"kube-state-metrics-0\" (UID: \"1fd020a3-7f41-424d-acd4-0e06764fafb3\") " pod="openstack/kube-state-metrics-0" Oct 06 08:37:18 crc kubenswrapper[4755]: I1006 08:37:18.826722 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v8vb\" (UniqueName: \"kubernetes.io/projected/1fd020a3-7f41-424d-acd4-0e06764fafb3-kube-api-access-7v8vb\") pod \"kube-state-metrics-0\" (UID: \"1fd020a3-7f41-424d-acd4-0e06764fafb3\") " pod="openstack/kube-state-metrics-0" Oct 06 08:37:18 crc kubenswrapper[4755]: I1006 08:37:18.848266 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v8vb\" (UniqueName: \"kubernetes.io/projected/1fd020a3-7f41-424d-acd4-0e06764fafb3-kube-api-access-7v8vb\") pod \"kube-state-metrics-0\" (UID: \"1fd020a3-7f41-424d-acd4-0e06764fafb3\") " pod="openstack/kube-state-metrics-0" Oct 06 08:37:18 crc kubenswrapper[4755]: I1006 08:37:18.904140 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:37:21 crc kubenswrapper[4755]: I1006 08:37:21.974738 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-b4rd2"] Oct 06 08:37:21 crc kubenswrapper[4755]: I1006 08:37:21.976240 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:21 crc kubenswrapper[4755]: I1006 08:37:21.980989 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 06 08:37:21 crc kubenswrapper[4755]: I1006 08:37:21.981234 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 06 08:37:21 crc kubenswrapper[4755]: I1006 08:37:21.981363 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-fgwtz" Oct 06 08:37:21 crc kubenswrapper[4755]: I1006 08:37:21.989837 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b4rd2"] Oct 06 08:37:21 crc kubenswrapper[4755]: I1006 08:37:21.996615 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-jd94b"] Oct 06 08:37:21 crc kubenswrapper[4755]: I1006 08:37:21.998749 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.023462 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jd94b"] Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.076917 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f67fp\" (UniqueName: \"kubernetes.io/projected/5dbdee79-0740-4068-a155-e865fe787402-kube-api-access-f67fp\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.077018 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5dbdee79-0740-4068-a155-e865fe787402-scripts\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.077051 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dbdee79-0740-4068-a155-e865fe787402-var-run-ovn\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.077073 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5dbdee79-0740-4068-a155-e865fe787402-var-log-ovn\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.077111 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/5dbdee79-0740-4068-a155-e865fe787402-var-run\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.077144 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbdee79-0740-4068-a155-e865fe787402-ovn-controller-tls-certs\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.077170 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbdee79-0740-4068-a155-e865fe787402-combined-ca-bundle\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.178335 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbdee79-0740-4068-a155-e865fe787402-ovn-controller-tls-certs\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.178461 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbdee79-0740-4068-a155-e865fe787402-combined-ca-bundle\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.178503 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/5a6dfcfa-6e0a-427c-88df-0619afb0195c-var-run\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.178527 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f67fp\" (UniqueName: \"kubernetes.io/projected/5dbdee79-0740-4068-a155-e865fe787402-kube-api-access-f67fp\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.178573 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a6dfcfa-6e0a-427c-88df-0619afb0195c-scripts\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.178598 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ps5f\" (UniqueName: \"kubernetes.io/projected/5a6dfcfa-6e0a-427c-88df-0619afb0195c-kube-api-access-9ps5f\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.178626 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5a6dfcfa-6e0a-427c-88df-0619afb0195c-var-log\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.178661 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/5dbdee79-0740-4068-a155-e865fe787402-scripts\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.178691 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5a6dfcfa-6e0a-427c-88df-0619afb0195c-var-lib\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.178713 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dbdee79-0740-4068-a155-e865fe787402-var-run-ovn\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.178731 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5dbdee79-0740-4068-a155-e865fe787402-var-log-ovn\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.178953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5dbdee79-0740-4068-a155-e865fe787402-var-run\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.179016 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5a6dfcfa-6e0a-427c-88df-0619afb0195c-etc-ovs\") pod \"ovn-controller-ovs-jd94b\" (UID: 
\"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.179206 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5dbdee79-0740-4068-a155-e865fe787402-var-log-ovn\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.180865 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dbdee79-0740-4068-a155-e865fe787402-var-run-ovn\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.181194 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5dbdee79-0740-4068-a155-e865fe787402-scripts\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.181205 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5dbdee79-0740-4068-a155-e865fe787402-var-run\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.184244 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbdee79-0740-4068-a155-e865fe787402-ovn-controller-tls-certs\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.198329 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbdee79-0740-4068-a155-e865fe787402-combined-ca-bundle\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.202851 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f67fp\" (UniqueName: \"kubernetes.io/projected/5dbdee79-0740-4068-a155-e865fe787402-kube-api-access-f67fp\") pod \"ovn-controller-b4rd2\" (UID: \"5dbdee79-0740-4068-a155-e865fe787402\") " pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.280195 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ps5f\" (UniqueName: \"kubernetes.io/projected/5a6dfcfa-6e0a-427c-88df-0619afb0195c-kube-api-access-9ps5f\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.280330 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5a6dfcfa-6e0a-427c-88df-0619afb0195c-var-log\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.280629 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5a6dfcfa-6e0a-427c-88df-0619afb0195c-var-log\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.280715 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/5a6dfcfa-6e0a-427c-88df-0619afb0195c-var-lib\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.280837 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5a6dfcfa-6e0a-427c-88df-0619afb0195c-etc-ovs\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.281020 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5a6dfcfa-6e0a-427c-88df-0619afb0195c-var-lib\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.281043 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5a6dfcfa-6e0a-427c-88df-0619afb0195c-etc-ovs\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.281126 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a6dfcfa-6e0a-427c-88df-0619afb0195c-var-run\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.281162 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a6dfcfa-6e0a-427c-88df-0619afb0195c-scripts\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " 
pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.281695 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a6dfcfa-6e0a-427c-88df-0619afb0195c-var-run\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.289552 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a6dfcfa-6e0a-427c-88df-0619afb0195c-scripts\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.297783 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ps5f\" (UniqueName: \"kubernetes.io/projected/5a6dfcfa-6e0a-427c-88df-0619afb0195c-kube-api-access-9ps5f\") pod \"ovn-controller-ovs-jd94b\" (UID: \"5a6dfcfa-6e0a-427c-88df-0619afb0195c\") " pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.302625 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:22 crc kubenswrapper[4755]: I1006 08:37:22.322825 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.393847 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.399294 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.401470 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.401765 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.401917 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.402217 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.402386 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-gcmqm" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.405033 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.502146 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4774g\" (UniqueName: \"kubernetes.io/projected/3327c559-a028-4094-be53-cc5c7c116a6f-kube-api-access-4774g\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.502218 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.502251 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3327c559-a028-4094-be53-cc5c7c116a6f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.502276 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3327c559-a028-4094-be53-cc5c7c116a6f-config\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.502317 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3327c559-a028-4094-be53-cc5c7c116a6f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.502340 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3327c559-a028-4094-be53-cc5c7c116a6f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.502418 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3327c559-a028-4094-be53-cc5c7c116a6f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.502438 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/3327c559-a028-4094-be53-cc5c7c116a6f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.603398 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3327c559-a028-4094-be53-cc5c7c116a6f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.603758 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3327c559-a028-4094-be53-cc5c7c116a6f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.603830 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4774g\" (UniqueName: \"kubernetes.io/projected/3327c559-a028-4094-be53-cc5c7c116a6f-kube-api-access-4774g\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.603857 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.603879 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3327c559-a028-4094-be53-cc5c7c116a6f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " 
pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.603896 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3327c559-a028-4094-be53-cc5c7c116a6f-config\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.603931 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3327c559-a028-4094-be53-cc5c7c116a6f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.603948 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3327c559-a028-4094-be53-cc5c7c116a6f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.604533 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.604824 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3327c559-a028-4094-be53-cc5c7c116a6f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.604930 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/3327c559-a028-4094-be53-cc5c7c116a6f-config\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.605297 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3327c559-a028-4094-be53-cc5c7c116a6f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.609073 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3327c559-a028-4094-be53-cc5c7c116a6f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.609545 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3327c559-a028-4094-be53-cc5c7c116a6f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.609708 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3327c559-a028-4094-be53-cc5c7c116a6f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.629919 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4774g\" (UniqueName: \"kubernetes.io/projected/3327c559-a028-4094-be53-cc5c7c116a6f-kube-api-access-4774g\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " 
pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.643848 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3327c559-a028-4094-be53-cc5c7c116a6f\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:23 crc kubenswrapper[4755]: I1006 08:37:23.721292 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:24 crc kubenswrapper[4755]: E1006 08:37:24.951737 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 06 08:37:24 crc kubenswrapper[4755]: E1006 08:37:24.951921 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nnzrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-zjfvd_openstack(b1d07c38-bd17-4b54-95ab-d13d53524497): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:37:24 crc kubenswrapper[4755]: E1006 08:37:24.953008 4755 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" podUID="b1d07c38-bd17-4b54-95ab-d13d53524497" Oct 06 08:37:24 crc kubenswrapper[4755]: E1006 08:37:24.955954 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 06 08:37:24 crc kubenswrapper[4755]: E1006 08:37:24.956095 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lrks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-5m7v2_openstack(2915d797-6ffc-4282-b5a9-85aa4ef0e378): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:37:24 crc kubenswrapper[4755]: E1006 08:37:24.957295 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-5m7v2" podUID="2915d797-6ffc-4282-b5a9-85aa4ef0e378" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.420374 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.577210 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.579309 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.597908 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dn52q" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.598428 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.598706 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.598900 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.613376 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.646523 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.659287 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.682670 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"70279926-92db-4788-b714-f14f60f4c55d","Type":"ContainerStarted","Data":"959963c8ec7b1b0c3227c64014ec9f1e53f91870bb1d57d8e9e03467101d1344"} Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.685384 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d5d33a7-9480-466b-abb7-e8fc7cf08776","Type":"ContainerStarted","Data":"2ce1ee4ac36b9fe0943176a7ad5cfee5d6af263da83b5b649ffb7a606d7fd6d9"} Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.689670 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905","Type":"ContainerStarted","Data":"409148eeb53fd888eaa0136472fad27bf10bd19eca9215596cf23ebf0824f1fd"} Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.745706 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7cf61af-2469-48d4-b3e9-77267e7d5328-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.745787 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.745821 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cf61af-2469-48d4-b3e9-77267e7d5328-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.745868 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7cf61af-2469-48d4-b3e9-77267e7d5328-config\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.745937 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7cf61af-2469-48d4-b3e9-77267e7d5328-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.745977 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7cf61af-2469-48d4-b3e9-77267e7d5328-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.746001 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5whp9\" (UniqueName: \"kubernetes.io/projected/d7cf61af-2469-48d4-b3e9-77267e7d5328-kube-api-access-5whp9\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.746043 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7cf61af-2469-48d4-b3e9-77267e7d5328-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.847661 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7cf61af-2469-48d4-b3e9-77267e7d5328-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.847764 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.847803 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cf61af-2469-48d4-b3e9-77267e7d5328-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.847842 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7cf61af-2469-48d4-b3e9-77267e7d5328-config\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.847896 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7cf61af-2469-48d4-b3e9-77267e7d5328-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.847932 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7cf61af-2469-48d4-b3e9-77267e7d5328-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " 
pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.847953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5whp9\" (UniqueName: \"kubernetes.io/projected/d7cf61af-2469-48d4-b3e9-77267e7d5328-kube-api-access-5whp9\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.847986 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7cf61af-2469-48d4-b3e9-77267e7d5328-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.848168 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.853166 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7cf61af-2469-48d4-b3e9-77267e7d5328-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.854247 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7cf61af-2469-48d4-b3e9-77267e7d5328-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.854282 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7cf61af-2469-48d4-b3e9-77267e7d5328-config\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.854307 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7cf61af-2469-48d4-b3e9-77267e7d5328-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.855284 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7cf61af-2469-48d4-b3e9-77267e7d5328-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.856864 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cf61af-2469-48d4-b3e9-77267e7d5328-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.871655 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5whp9\" (UniqueName: \"kubernetes.io/projected/d7cf61af-2469-48d4-b3e9-77267e7d5328-kube-api-access-5whp9\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.889049 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d7cf61af-2469-48d4-b3e9-77267e7d5328\") " 
pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.936804 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s2p7n"] Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.942205 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.952063 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 08:37:25 crc kubenswrapper[4755]: I1006 08:37:25.962608 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pwl7d"] Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.000671 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.007336 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.025975 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b4rd2"] Oct 06 08:37:26 crc kubenswrapper[4755]: W1006 08:37:26.057479 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod733a7b61_b175_4381_ade8_91bd0714c2fa.slice/crio-28b0a5b68ff7915d7e6a331838e4f99fc8fb9da17b606aa386f1f279617b9e05 WatchSource:0}: Error finding container 28b0a5b68ff7915d7e6a331838e4f99fc8fb9da17b606aa386f1f279617b9e05: Status 404 returned error can't find the container with id 28b0a5b68ff7915d7e6a331838e4f99fc8fb9da17b606aa386f1f279617b9e05 Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.084435 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5m7v2" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.126735 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.145661 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.256307 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d07c38-bd17-4b54-95ab-d13d53524497-config\") pod \"b1d07c38-bd17-4b54-95ab-d13d53524497\" (UID: \"b1d07c38-bd17-4b54-95ab-d13d53524497\") " Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.256448 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnzrk\" (UniqueName: \"kubernetes.io/projected/b1d07c38-bd17-4b54-95ab-d13d53524497-kube-api-access-nnzrk\") pod \"b1d07c38-bd17-4b54-95ab-d13d53524497\" (UID: \"b1d07c38-bd17-4b54-95ab-d13d53524497\") " Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.256483 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2915d797-6ffc-4282-b5a9-85aa4ef0e378-config\") pod \"2915d797-6ffc-4282-b5a9-85aa4ef0e378\" (UID: \"2915d797-6ffc-4282-b5a9-85aa4ef0e378\") " Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.256500 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lrks\" (UniqueName: \"kubernetes.io/projected/2915d797-6ffc-4282-b5a9-85aa4ef0e378-kube-api-access-6lrks\") pod \"2915d797-6ffc-4282-b5a9-85aa4ef0e378\" (UID: \"2915d797-6ffc-4282-b5a9-85aa4ef0e378\") " Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.256639 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b1d07c38-bd17-4b54-95ab-d13d53524497-dns-svc\") pod \"b1d07c38-bd17-4b54-95ab-d13d53524497\" (UID: \"b1d07c38-bd17-4b54-95ab-d13d53524497\") " Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.257366 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d07c38-bd17-4b54-95ab-d13d53524497-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1d07c38-bd17-4b54-95ab-d13d53524497" (UID: "b1d07c38-bd17-4b54-95ab-d13d53524497"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.257381 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d07c38-bd17-4b54-95ab-d13d53524497-config" (OuterVolumeSpecName: "config") pod "b1d07c38-bd17-4b54-95ab-d13d53524497" (UID: "b1d07c38-bd17-4b54-95ab-d13d53524497"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.257403 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2915d797-6ffc-4282-b5a9-85aa4ef0e378-config" (OuterVolumeSpecName: "config") pod "2915d797-6ffc-4282-b5a9-85aa4ef0e378" (UID: "2915d797-6ffc-4282-b5a9-85aa4ef0e378"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.263706 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2915d797-6ffc-4282-b5a9-85aa4ef0e378-kube-api-access-6lrks" (OuterVolumeSpecName: "kube-api-access-6lrks") pod "2915d797-6ffc-4282-b5a9-85aa4ef0e378" (UID: "2915d797-6ffc-4282-b5a9-85aa4ef0e378"). InnerVolumeSpecName "kube-api-access-6lrks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.263995 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d07c38-bd17-4b54-95ab-d13d53524497-kube-api-access-nnzrk" (OuterVolumeSpecName: "kube-api-access-nnzrk") pod "b1d07c38-bd17-4b54-95ab-d13d53524497" (UID: "b1d07c38-bd17-4b54-95ab-d13d53524497"). InnerVolumeSpecName "kube-api-access-nnzrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.357860 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnzrk\" (UniqueName: \"kubernetes.io/projected/b1d07c38-bd17-4b54-95ab-d13d53524497-kube-api-access-nnzrk\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.357897 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2915d797-6ffc-4282-b5a9-85aa4ef0e378-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.357907 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lrks\" (UniqueName: \"kubernetes.io/projected/2915d797-6ffc-4282-b5a9-85aa4ef0e378-kube-api-access-6lrks\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.357918 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d07c38-bd17-4b54-95ab-d13d53524497-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.357928 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d07c38-bd17-4b54-95ab-d13d53524497-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.546870 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 
08:37:26 crc kubenswrapper[4755]: W1006 08:37:26.566987 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7cf61af_2469_48d4_b3e9_77267e7d5328.slice/crio-ce655d28ec801f019be2ef846dd7bf816ea890e471117e23309a9eebfe89b778 WatchSource:0}: Error finding container ce655d28ec801f019be2ef846dd7bf816ea890e471117e23309a9eebfe89b778: Status 404 returned error can't find the container with id ce655d28ec801f019be2ef846dd7bf816ea890e471117e23309a9eebfe89b778 Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.698355 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" event={"ID":"b1d07c38-bd17-4b54-95ab-d13d53524497","Type":"ContainerDied","Data":"f3c0bc13ed8d37786f3080a95eede6874d719be0240155bd67e027cafa770aad"} Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.698410 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zjfvd" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.700517 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b13f13fe-a34f-4566-b0bd-31b326722b01","Type":"ContainerStarted","Data":"86b52ec1fc2638f2060ac390b84c54b562f8e17ca1522435f880018c286eb77c"} Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.701666 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1fd020a3-7f41-424d-acd4-0e06764fafb3","Type":"ContainerStarted","Data":"0225053ecdac0fdae29f01f3f868a5de82154857aac4be18069e0eac94d04d22"} Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.702811 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3327c559-a028-4094-be53-cc5c7c116a6f","Type":"ContainerStarted","Data":"eef01a7c0ecdcb4e293f1c2b9665ad8bcc856184a4760bbdbe2c03115feef3b7"} Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 
08:37:26.704108 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" event={"ID":"733a7b61-b175-4381-ade8-91bd0714c2fa","Type":"ContainerStarted","Data":"28b0a5b68ff7915d7e6a331838e4f99fc8fb9da17b606aa386f1f279617b9e05"} Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.705330 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0ea480ba-e1ea-47db-b647-39833517fcad","Type":"ContainerStarted","Data":"373a24b8dc419624e1ad77bea35a74d04da47d8aa84d7dea09d53cedd07b7d3b"} Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.706974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b4rd2" event={"ID":"5dbdee79-0740-4068-a155-e865fe787402","Type":"ContainerStarted","Data":"4c1038d3c86934400f05951f1092575e2b3ed26c8ed4a5a5922535e0b038846a"} Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.708384 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d7cf61af-2469-48d4-b3e9-77267e7d5328","Type":"ContainerStarted","Data":"ce655d28ec801f019be2ef846dd7bf816ea890e471117e23309a9eebfe89b778"} Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.710311 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5m7v2" Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.710372 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-5m7v2" event={"ID":"2915d797-6ffc-4282-b5a9-85aa4ef0e378","Type":"ContainerDied","Data":"c9b9237e25af14e58fa36742888511806960e258d8bd564e4d6dd10bb26fb2e4"} Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.712530 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" event={"ID":"135425fd-b05d-4bce-97c2-a7ccc0f71a3e","Type":"ContainerStarted","Data":"61e51f048e0769de47fd4e6740c333c2cd13b8d80c320fd745ec15ba0e0f0d4b"} Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.779573 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zjfvd"] Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.805193 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zjfvd"] Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.864357 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jd94b"] Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.873896 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5m7v2"] Oct 06 08:37:26 crc kubenswrapper[4755]: I1006 08:37:26.883640 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5m7v2"] Oct 06 08:37:27 crc kubenswrapper[4755]: W1006 08:37:27.140506 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a6dfcfa_6e0a_427c_88df_0619afb0195c.slice/crio-1607a53c48da30112aea10dba6b3e653a760167942181a76a0c90d37d454ad56 WatchSource:0}: Error finding container 1607a53c48da30112aea10dba6b3e653a760167942181a76a0c90d37d454ad56: Status 404 returned error can't find the container with id 
1607a53c48da30112aea10dba6b3e653a760167942181a76a0c90d37d454ad56 Oct 06 08:37:27 crc kubenswrapper[4755]: I1006 08:37:27.722949 4755 generic.go:334] "Generic (PLEG): container finished" podID="135425fd-b05d-4bce-97c2-a7ccc0f71a3e" containerID="c820206b68649e85fa707af0bfe1f1d2e13c43a33c9030814d4ae180d574eac5" exitCode=0 Oct 06 08:37:27 crc kubenswrapper[4755]: I1006 08:37:27.723020 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" event={"ID":"135425fd-b05d-4bce-97c2-a7ccc0f71a3e","Type":"ContainerDied","Data":"c820206b68649e85fa707af0bfe1f1d2e13c43a33c9030814d4ae180d574eac5"} Oct 06 08:37:27 crc kubenswrapper[4755]: I1006 08:37:27.724907 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jd94b" event={"ID":"5a6dfcfa-6e0a-427c-88df-0619afb0195c","Type":"ContainerStarted","Data":"1607a53c48da30112aea10dba6b3e653a760167942181a76a0c90d37d454ad56"} Oct 06 08:37:27 crc kubenswrapper[4755]: I1006 08:37:27.729139 4755 generic.go:334] "Generic (PLEG): container finished" podID="733a7b61-b175-4381-ade8-91bd0714c2fa" containerID="0d53fc0f5c82899dca17046af4f7a832765e3f1c900870ff846473b2db0bbb8e" exitCode=0 Oct 06 08:37:27 crc kubenswrapper[4755]: I1006 08:37:27.729232 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" event={"ID":"733a7b61-b175-4381-ade8-91bd0714c2fa","Type":"ContainerDied","Data":"0d53fc0f5c82899dca17046af4f7a832765e3f1c900870ff846473b2db0bbb8e"} Oct 06 08:37:27 crc kubenswrapper[4755]: I1006 08:37:27.887939 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2915d797-6ffc-4282-b5a9-85aa4ef0e378" path="/var/lib/kubelet/pods/2915d797-6ffc-4282-b5a9-85aa4ef0e378/volumes" Oct 06 08:37:27 crc kubenswrapper[4755]: I1006 08:37:27.888595 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d07c38-bd17-4b54-95ab-d13d53524497" 
path="/var/lib/kubelet/pods/b1d07c38-bd17-4b54-95ab-d13d53524497/volumes" Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.794740 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" event={"ID":"733a7b61-b175-4381-ade8-91bd0714c2fa","Type":"ContainerStarted","Data":"f8d4e4c6713b9ff55b73dafd539b33756e35e0ffc41b4d9911ef997e3dbc223c"} Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.795638 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.800796 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d7cf61af-2469-48d4-b3e9-77267e7d5328","Type":"ContainerStarted","Data":"24324b9ea80b8c7e274a5dbb9cfcbf310bdc4c70447ce17d20c9fe4bb57879aa"} Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.802507 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b13f13fe-a34f-4566-b0bd-31b326722b01","Type":"ContainerStarted","Data":"3d0b05fce3a088c601513dd25cad2d9503a63de5eccf9e3ed5babb986754e358"} Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.812970 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" podStartSLOduration=22.342636224 podStartE2EDuration="22.812950224s" podCreationTimestamp="2025-10-06 08:37:12 +0000 UTC" firstStartedPulling="2025-10-06 08:37:26.064019097 +0000 UTC m=+902.893334311" lastFinishedPulling="2025-10-06 08:37:26.534333097 +0000 UTC m=+903.363648311" observedRunningTime="2025-10-06 08:37:34.811671303 +0000 UTC m=+911.640986537" watchObservedRunningTime="2025-10-06 08:37:34.812950224 +0000 UTC m=+911.642265428" Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.813007 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"1fd020a3-7f41-424d-acd4-0e06764fafb3","Type":"ContainerStarted","Data":"e6fc338c96cb9fe6db14d7ba12328ddd08f9a86bba6f238181c42e11b173bec5"} Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.813168 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.817507 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" event={"ID":"135425fd-b05d-4bce-97c2-a7ccc0f71a3e","Type":"ContainerStarted","Data":"0d2ec604487f516fbff58c88465c10533e1e5b38088e9834904cab31ca7d4ff2"} Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.817596 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.820236 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jd94b" event={"ID":"5a6dfcfa-6e0a-427c-88df-0619afb0195c","Type":"ContainerStarted","Data":"6a43b21b98787ce385d9fc729b9824953daebe8313501ae7bf5763e367aa0ed9"} Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.822077 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3327c559-a028-4094-be53-cc5c7c116a6f","Type":"ContainerStarted","Data":"28701dd679960dff9412c2bb9b65a3c5b49b61e23d1f4ac070b21fc026507ee9"} Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.823285 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0ea480ba-e1ea-47db-b647-39833517fcad","Type":"ContainerStarted","Data":"a83f72e25cacfb82d8f9e97b30f1caff842dfa967cdb20f9065b067ab0c316a1"} Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.825122 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b4rd2" 
event={"ID":"5dbdee79-0740-4068-a155-e865fe787402","Type":"ContainerStarted","Data":"a1da91a47ee444da0f5ed33f42e0623411505a02ed889c3f29a8ba2750123d4d"} Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.825627 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-b4rd2" Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.832053 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"70279926-92db-4788-b714-f14f60f4c55d","Type":"ContainerStarted","Data":"6bc651e93ebfe12d16bbd938892ed8dac4d288c14d0163455430441800211ac3"} Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.837302 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.875961 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.821467007 podStartE2EDuration="18.875941758s" podCreationTimestamp="2025-10-06 08:37:16 +0000 UTC" firstStartedPulling="2025-10-06 08:37:25.61113926 +0000 UTC m=+902.440454474" lastFinishedPulling="2025-10-06 08:37:32.665614011 +0000 UTC m=+909.494929225" observedRunningTime="2025-10-06 08:37:34.875769174 +0000 UTC m=+911.705084388" watchObservedRunningTime="2025-10-06 08:37:34.875941758 +0000 UTC m=+911.705256972" Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.920491 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-b4rd2" podStartSLOduration=6.22956303 podStartE2EDuration="13.920473563s" podCreationTimestamp="2025-10-06 08:37:21 +0000 UTC" firstStartedPulling="2025-10-06 08:37:26.010361404 +0000 UTC m=+902.839676618" lastFinishedPulling="2025-10-06 08:37:33.701271927 +0000 UTC m=+910.530587151" observedRunningTime="2025-10-06 08:37:34.917419899 +0000 UTC m=+911.746735153" watchObservedRunningTime="2025-10-06 08:37:34.920473563 +0000 UTC m=+911.749788777" 
Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.961517 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" podStartSLOduration=22.44309851 podStartE2EDuration="22.961490092s" podCreationTimestamp="2025-10-06 08:37:12 +0000 UTC" firstStartedPulling="2025-10-06 08:37:26.012519213 +0000 UTC m=+902.841834427" lastFinishedPulling="2025-10-06 08:37:26.530910795 +0000 UTC m=+903.360226009" observedRunningTime="2025-10-06 08:37:34.942442537 +0000 UTC m=+911.771757751" watchObservedRunningTime="2025-10-06 08:37:34.961490092 +0000 UTC m=+911.790805306" Oct 06 08:37:34 crc kubenswrapper[4755]: I1006 08:37:34.962649 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.627671166 podStartE2EDuration="16.96263967s" podCreationTimestamp="2025-10-06 08:37:18 +0000 UTC" firstStartedPulling="2025-10-06 08:37:26.053191834 +0000 UTC m=+902.882507048" lastFinishedPulling="2025-10-06 08:37:34.388160328 +0000 UTC m=+911.217475552" observedRunningTime="2025-10-06 08:37:34.95687973 +0000 UTC m=+911.786194944" watchObservedRunningTime="2025-10-06 08:37:34.96263967 +0000 UTC m=+911.791954904" Oct 06 08:37:35 crc kubenswrapper[4755]: I1006 08:37:35.845312 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jd94b" event={"ID":"5a6dfcfa-6e0a-427c-88df-0619afb0195c","Type":"ContainerDied","Data":"6a43b21b98787ce385d9fc729b9824953daebe8313501ae7bf5763e367aa0ed9"} Oct 06 08:37:35 crc kubenswrapper[4755]: I1006 08:37:35.845268 4755 generic.go:334] "Generic (PLEG): container finished" podID="5a6dfcfa-6e0a-427c-88df-0619afb0195c" containerID="6a43b21b98787ce385d9fc729b9824953daebe8313501ae7bf5763e367aa0ed9" exitCode=0 Oct 06 08:37:35 crc kubenswrapper[4755]: I1006 08:37:35.848364 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905","Type":"ContainerStarted","Data":"1633e3c5c5ebfc34e508071c9f8e1f1237359e4c454fd67af3224492420f4fbd"} Oct 06 08:37:35 crc kubenswrapper[4755]: I1006 08:37:35.853282 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d5d33a7-9480-466b-abb7-e8fc7cf08776","Type":"ContainerStarted","Data":"16177d3511ed44688be5cb711444e8d032e2e2c914042e75ecd13cca11fbce6d"} Oct 06 08:37:36 crc kubenswrapper[4755]: I1006 08:37:36.867169 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jd94b" event={"ID":"5a6dfcfa-6e0a-427c-88df-0619afb0195c","Type":"ContainerStarted","Data":"6819ae259dcf8243adb067e954e58910ee13446a5d69d5059f5edd3fe90b6fc3"} Oct 06 08:37:36 crc kubenswrapper[4755]: I1006 08:37:36.867498 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jd94b" event={"ID":"5a6dfcfa-6e0a-427c-88df-0619afb0195c","Type":"ContainerStarted","Data":"e2303e0159c1d85959c3804521eb180cd3a849f765aa0c9eedea4473bf2aaee2"} Oct 06 08:37:36 crc kubenswrapper[4755]: I1006 08:37:36.896854 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-jd94b" podStartSLOduration=9.508700945 podStartE2EDuration="15.89683909s" podCreationTimestamp="2025-10-06 08:37:21 +0000 UTC" firstStartedPulling="2025-10-06 08:37:27.143788437 +0000 UTC m=+903.973103651" lastFinishedPulling="2025-10-06 08:37:33.531926572 +0000 UTC m=+910.361241796" observedRunningTime="2025-10-06 08:37:36.896061492 +0000 UTC m=+913.725376706" watchObservedRunningTime="2025-10-06 08:37:36.89683909 +0000 UTC m=+913.726154304" Oct 06 08:37:37 crc kubenswrapper[4755]: I1006 08:37:37.323139 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:37 crc kubenswrapper[4755]: I1006 08:37:37.323473 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:37:37 crc kubenswrapper[4755]: I1006 08:37:37.880853 4755 generic.go:334] "Generic (PLEG): container finished" podID="b13f13fe-a34f-4566-b0bd-31b326722b01" containerID="3d0b05fce3a088c601513dd25cad2d9503a63de5eccf9e3ed5babb986754e358" exitCode=0 Oct 06 08:37:37 crc kubenswrapper[4755]: I1006 08:37:37.888797 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b13f13fe-a34f-4566-b0bd-31b326722b01","Type":"ContainerDied","Data":"3d0b05fce3a088c601513dd25cad2d9503a63de5eccf9e3ed5babb986754e358"} Oct 06 08:37:38 crc kubenswrapper[4755]: I1006 08:37:38.890645 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d7cf61af-2469-48d4-b3e9-77267e7d5328","Type":"ContainerStarted","Data":"8826e097933beb71121ffa4a33acb75202767f3663500b4a67e3efb570655354"} Oct 06 08:37:38 crc kubenswrapper[4755]: I1006 08:37:38.892701 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b13f13fe-a34f-4566-b0bd-31b326722b01","Type":"ContainerStarted","Data":"b60eebb7642cdcef5ac2491ffdef3965d9f69e7fa11cb7acfdae6f90a8b9b414"} Oct 06 08:37:38 crc kubenswrapper[4755]: I1006 08:37:38.894972 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3327c559-a028-4094-be53-cc5c7c116a6f","Type":"ContainerStarted","Data":"e18385e5e740f3a1de1255a8e9c869a5699a3141d7cc509e36b8d3025f20f6e0"} Oct 06 08:37:38 crc kubenswrapper[4755]: I1006 08:37:38.896503 4755 generic.go:334] "Generic (PLEG): container finished" podID="0ea480ba-e1ea-47db-b647-39833517fcad" containerID="a83f72e25cacfb82d8f9e97b30f1caff842dfa967cdb20f9065b067ab0c316a1" exitCode=0 Oct 06 08:37:38 crc kubenswrapper[4755]: I1006 08:37:38.896547 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"0ea480ba-e1ea-47db-b647-39833517fcad","Type":"ContainerDied","Data":"a83f72e25cacfb82d8f9e97b30f1caff842dfa967cdb20f9065b067ab0c316a1"} Oct 06 08:37:38 crc kubenswrapper[4755]: I1006 08:37:38.924636 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.800180297 podStartE2EDuration="14.924617422s" podCreationTimestamp="2025-10-06 08:37:24 +0000 UTC" firstStartedPulling="2025-10-06 08:37:26.584333012 +0000 UTC m=+903.413648226" lastFinishedPulling="2025-10-06 08:37:37.708770137 +0000 UTC m=+914.538085351" observedRunningTime="2025-10-06 08:37:38.921036234 +0000 UTC m=+915.750351448" watchObservedRunningTime="2025-10-06 08:37:38.924617422 +0000 UTC m=+915.753932636" Oct 06 08:37:38 crc kubenswrapper[4755]: I1006 08:37:38.954114 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=17.546608317 podStartE2EDuration="24.954086489s" podCreationTimestamp="2025-10-06 08:37:14 +0000 UTC" firstStartedPulling="2025-10-06 08:37:26.03126546 +0000 UTC m=+902.860580674" lastFinishedPulling="2025-10-06 08:37:33.438743592 +0000 UTC m=+910.268058846" observedRunningTime="2025-10-06 08:37:38.949268461 +0000 UTC m=+915.778583685" watchObservedRunningTime="2025-10-06 08:37:38.954086489 +0000 UTC m=+915.783401713" Oct 06 08:37:38 crc kubenswrapper[4755]: I1006 08:37:38.990660 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.43440764 podStartE2EDuration="16.990640189s" podCreationTimestamp="2025-10-06 08:37:22 +0000 UTC" firstStartedPulling="2025-10-06 08:37:26.164698684 +0000 UTC m=+902.994013898" lastFinishedPulling="2025-10-06 08:37:37.720931243 +0000 UTC m=+914.550246447" observedRunningTime="2025-10-06 08:37:38.988962369 +0000 UTC m=+915.818277573" watchObservedRunningTime="2025-10-06 08:37:38.990640189 +0000 UTC m=+915.819955403" Oct 06 08:37:39 
crc kubenswrapper[4755]: I1006 08:37:39.912696 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0ea480ba-e1ea-47db-b647-39833517fcad","Type":"ContainerStarted","Data":"b7462952a556f430d145a6c825172445662c29e0ec477207f0456b0a110696eb"} Oct 06 08:37:39 crc kubenswrapper[4755]: I1006 08:37:39.937326 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=17.27860536 podStartE2EDuration="24.937300887s" podCreationTimestamp="2025-10-06 08:37:15 +0000 UTC" firstStartedPulling="2025-10-06 08:37:26.0438278 +0000 UTC m=+902.873143014" lastFinishedPulling="2025-10-06 08:37:33.702523317 +0000 UTC m=+910.531838541" observedRunningTime="2025-10-06 08:37:39.93538047 +0000 UTC m=+916.764695724" watchObservedRunningTime="2025-10-06 08:37:39.937300887 +0000 UTC m=+916.766616101" Oct 06 08:37:40 crc kubenswrapper[4755]: I1006 08:37:40.943201 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:40 crc kubenswrapper[4755]: I1006 08:37:40.943248 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:41 crc kubenswrapper[4755]: I1006 08:37:41.002726 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:41 crc kubenswrapper[4755]: I1006 08:37:41.722054 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:41 crc kubenswrapper[4755]: I1006 08:37:41.762608 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:41 crc kubenswrapper[4755]: I1006 08:37:41.925532 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:41 crc kubenswrapper[4755]: I1006 08:37:41.964340 4755 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:41 crc kubenswrapper[4755]: I1006 08:37:41.965504 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.097816 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.155287 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s2p7n"] Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.155765 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" podUID="135425fd-b05d-4bce-97c2-a7ccc0f71a3e" containerName="dnsmasq-dns" containerID="cri-o://0d2ec604487f516fbff58c88465c10533e1e5b38088e9834904cab31ca7d4ff2" gracePeriod=10 Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.157741 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.208622 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xmdfl"] Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.210282 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.218549 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.221706 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-flm4c"] Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.223203 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.226410 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.247227 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xmdfl"] Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.256179 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-flm4c"] Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.344375 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pwl7d"] Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.345244 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c01d8771-a0d8-436a-883d-5fb95dec3b59-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.345326 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01d8771-a0d8-436a-883d-5fb95dec3b59-combined-ca-bundle\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.345379 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-xmdfl\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 
crc kubenswrapper[4755]: I1006 08:37:42.345407 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56dnq\" (UniqueName: \"kubernetes.io/projected/c01d8771-a0d8-436a-883d-5fb95dec3b59-kube-api-access-56dnq\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.345462 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c01d8771-a0d8-436a-883d-5fb95dec3b59-ovs-rundir\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.345503 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01d8771-a0d8-436a-883d-5fb95dec3b59-config\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.345540 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-config\") pod \"dnsmasq-dns-7f896c8c65-xmdfl\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.345581 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c01d8771-a0d8-436a-883d-5fb95dec3b59-ovn-rundir\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 
08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.345611 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49bd\" (UniqueName: \"kubernetes.io/projected/b8f486fd-fc13-4344-8355-ffe4903af499-kube-api-access-b49bd\") pod \"dnsmasq-dns-7f896c8c65-xmdfl\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.345639 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-xmdfl\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.360770 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" podUID="733a7b61-b175-4381-ade8-91bd0714c2fa" containerName="dnsmasq-dns" containerID="cri-o://f8d4e4c6713b9ff55b73dafd539b33756e35e0ffc41b4d9911ef997e3dbc223c" gracePeriod=10 Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.370677 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.371130 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.384687 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.391642 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-94vng" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.391897 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.392048 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.392194 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.392323 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.400886 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bsqzt"] Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.402238 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.410914 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.436796 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bsqzt"] Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.448871 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aa143d7-a987-43a1-992c-7b33b12710dd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.448929 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c01d8771-a0d8-436a-883d-5fb95dec3b59-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.448978 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01d8771-a0d8-436a-883d-5fb95dec3b59-combined-ca-bundle\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-xmdfl\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 crc 
kubenswrapper[4755]: I1006 08:37:42.449024 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56dnq\" (UniqueName: \"kubernetes.io/projected/c01d8771-a0d8-436a-883d-5fb95dec3b59-kube-api-access-56dnq\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449073 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c01d8771-a0d8-436a-883d-5fb95dec3b59-ovs-rundir\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449096 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aa143d7-a987-43a1-992c-7b33b12710dd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449137 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01d8771-a0d8-436a-883d-5fb95dec3b59-config\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449157 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8aa143d7-a987-43a1-992c-7b33b12710dd-scripts\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449176 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-config\") pod \"dnsmasq-dns-7f896c8c65-xmdfl\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449190 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8aa143d7-a987-43a1-992c-7b33b12710dd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449224 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c01d8771-a0d8-436a-883d-5fb95dec3b59-ovn-rundir\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449253 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49bd\" (UniqueName: \"kubernetes.io/projected/b8f486fd-fc13-4344-8355-ffe4903af499-kube-api-access-b49bd\") pod \"dnsmasq-dns-7f896c8c65-xmdfl\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449286 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-xmdfl\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449332 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8aa143d7-a987-43a1-992c-7b33b12710dd-config\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449388 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa143d7-a987-43a1-992c-7b33b12710dd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449405 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlsfd\" (UniqueName: \"kubernetes.io/projected/8aa143d7-a987-43a1-992c-7b33b12710dd-kube-api-access-wlsfd\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.449814 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c01d8771-a0d8-436a-883d-5fb95dec3b59-ovn-rundir\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.450425 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c01d8771-a0d8-436a-883d-5fb95dec3b59-ovs-rundir\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.450863 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-config\") pod \"dnsmasq-dns-7f896c8c65-xmdfl\" (UID: 
\"b8f486fd-fc13-4344-8355-ffe4903af499\") " pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.450930 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-xmdfl\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.451007 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01d8771-a0d8-436a-883d-5fb95dec3b59-config\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.451461 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-xmdfl\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.457238 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01d8771-a0d8-436a-883d-5fb95dec3b59-combined-ca-bundle\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.458030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c01d8771-a0d8-436a-883d-5fb95dec3b59-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 
crc kubenswrapper[4755]: I1006 08:37:42.472973 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56dnq\" (UniqueName: \"kubernetes.io/projected/c01d8771-a0d8-436a-883d-5fb95dec3b59-kube-api-access-56dnq\") pod \"ovn-controller-metrics-flm4c\" (UID: \"c01d8771-a0d8-436a-883d-5fb95dec3b59\") " pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.473820 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49bd\" (UniqueName: \"kubernetes.io/projected/b8f486fd-fc13-4344-8355-ffe4903af499-kube-api-access-b49bd\") pod \"dnsmasq-dns-7f896c8c65-xmdfl\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.561365 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.561398 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.561439 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa143d7-a987-43a1-992c-7b33b12710dd-config\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.561484 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-config\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.561503 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa143d7-a987-43a1-992c-7b33b12710dd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.561522 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlsfd\" (UniqueName: \"kubernetes.io/projected/8aa143d7-a987-43a1-992c-7b33b12710dd-kube-api-access-wlsfd\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.561554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aa143d7-a987-43a1-992c-7b33b12710dd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.561620 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.561653 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aa143d7-a987-43a1-992c-7b33b12710dd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.561679 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8aa143d7-a987-43a1-992c-7b33b12710dd-scripts\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.561697 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8aa143d7-a987-43a1-992c-7b33b12710dd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.561713 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdnff\" (UniqueName: \"kubernetes.io/projected/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-kube-api-access-sdnff\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.562536 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa143d7-a987-43a1-992c-7b33b12710dd-config\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.564314 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8aa143d7-a987-43a1-992c-7b33b12710dd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " 
pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.566217 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8aa143d7-a987-43a1-992c-7b33b12710dd-scripts\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.569031 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aa143d7-a987-43a1-992c-7b33b12710dd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.569713 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aa143d7-a987-43a1-992c-7b33b12710dd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.570513 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa143d7-a987-43a1-992c-7b33b12710dd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.600487 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlsfd\" (UniqueName: \"kubernetes.io/projected/8aa143d7-a987-43a1-992c-7b33b12710dd-kube-api-access-wlsfd\") pod \"ovn-northd-0\" (UID: \"8aa143d7-a987-43a1-992c-7b33b12710dd\") " pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.636712 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.650277 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-flm4c" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.664436 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.664506 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdnff\" (UniqueName: \"kubernetes.io/projected/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-kube-api-access-sdnff\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.664556 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.664594 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.664654 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-config\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.665495 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-config\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.666040 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.666867 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.667351 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.696439 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdnff\" (UniqueName: \"kubernetes.io/projected/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-kube-api-access-sdnff\") pod \"dnsmasq-dns-86db49b7ff-bsqzt\" 
(UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.717512 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.729324 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.744140 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.833661 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.884144 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-dns-svc\") pod \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\" (UID: \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\") " Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.884339 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr6rk\" (UniqueName: \"kubernetes.io/projected/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-kube-api-access-wr6rk\") pod \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\" (UID: \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\") " Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.884400 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-config\") pod \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\" (UID: \"135425fd-b05d-4bce-97c2-a7ccc0f71a3e\") " Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.894699 4755 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-kube-api-access-wr6rk" (OuterVolumeSpecName: "kube-api-access-wr6rk") pod "135425fd-b05d-4bce-97c2-a7ccc0f71a3e" (UID: "135425fd-b05d-4bce-97c2-a7ccc0f71a3e"). InnerVolumeSpecName "kube-api-access-wr6rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.931977 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "135425fd-b05d-4bce-97c2-a7ccc0f71a3e" (UID: "135425fd-b05d-4bce-97c2-a7ccc0f71a3e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.938330 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-config" (OuterVolumeSpecName: "config") pod "135425fd-b05d-4bce-97c2-a7ccc0f71a3e" (UID: "135425fd-b05d-4bce-97c2-a7ccc0f71a3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.944952 4755 generic.go:334] "Generic (PLEG): container finished" podID="135425fd-b05d-4bce-97c2-a7ccc0f71a3e" containerID="0d2ec604487f516fbff58c88465c10533e1e5b38088e9834904cab31ca7d4ff2" exitCode=0 Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.945019 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.945050 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" event={"ID":"135425fd-b05d-4bce-97c2-a7ccc0f71a3e","Type":"ContainerDied","Data":"0d2ec604487f516fbff58c88465c10533e1e5b38088e9834904cab31ca7d4ff2"} Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.945115 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s2p7n" event={"ID":"135425fd-b05d-4bce-97c2-a7ccc0f71a3e","Type":"ContainerDied","Data":"61e51f048e0769de47fd4e6740c333c2cd13b8d80c320fd745ec15ba0e0f0d4b"} Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.945140 4755 scope.go:117] "RemoveContainer" containerID="0d2ec604487f516fbff58c88465c10533e1e5b38088e9834904cab31ca7d4ff2" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.949338 4755 generic.go:334] "Generic (PLEG): container finished" podID="733a7b61-b175-4381-ade8-91bd0714c2fa" containerID="f8d4e4c6713b9ff55b73dafd539b33756e35e0ffc41b4d9911ef997e3dbc223c" exitCode=0 Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.950534 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.950914 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" event={"ID":"733a7b61-b175-4381-ade8-91bd0714c2fa","Type":"ContainerDied","Data":"f8d4e4c6713b9ff55b73dafd539b33756e35e0ffc41b4d9911ef997e3dbc223c"} Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.950946 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" event={"ID":"733a7b61-b175-4381-ade8-91bd0714c2fa","Type":"ContainerDied","Data":"28b0a5b68ff7915d7e6a331838e4f99fc8fb9da17b606aa386f1f279617b9e05"} Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.980375 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s2p7n"] Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.982376 4755 scope.go:117] "RemoveContainer" containerID="c820206b68649e85fa707af0bfe1f1d2e13c43a33c9030814d4ae180d574eac5" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.985873 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tbzt\" (UniqueName: \"kubernetes.io/projected/733a7b61-b175-4381-ade8-91bd0714c2fa-kube-api-access-6tbzt\") pod \"733a7b61-b175-4381-ade8-91bd0714c2fa\" (UID: \"733a7b61-b175-4381-ade8-91bd0714c2fa\") " Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.985989 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733a7b61-b175-4381-ade8-91bd0714c2fa-config\") pod \"733a7b61-b175-4381-ade8-91bd0714c2fa\" (UID: \"733a7b61-b175-4381-ade8-91bd0714c2fa\") " Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.986112 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/733a7b61-b175-4381-ade8-91bd0714c2fa-dns-svc\") pod 
\"733a7b61-b175-4381-ade8-91bd0714c2fa\" (UID: \"733a7b61-b175-4381-ade8-91bd0714c2fa\") " Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.986900 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.986922 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.986932 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr6rk\" (UniqueName: \"kubernetes.io/projected/135425fd-b05d-4bce-97c2-a7ccc0f71a3e-kube-api-access-wr6rk\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.988865 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s2p7n"] Oct 06 08:37:42 crc kubenswrapper[4755]: I1006 08:37:42.991649 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/733a7b61-b175-4381-ade8-91bd0714c2fa-kube-api-access-6tbzt" (OuterVolumeSpecName: "kube-api-access-6tbzt") pod "733a7b61-b175-4381-ade8-91bd0714c2fa" (UID: "733a7b61-b175-4381-ade8-91bd0714c2fa"). InnerVolumeSpecName "kube-api-access-6tbzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.000136 4755 scope.go:117] "RemoveContainer" containerID="0d2ec604487f516fbff58c88465c10533e1e5b38088e9834904cab31ca7d4ff2" Oct 06 08:37:43 crc kubenswrapper[4755]: E1006 08:37:43.000973 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d2ec604487f516fbff58c88465c10533e1e5b38088e9834904cab31ca7d4ff2\": container with ID starting with 0d2ec604487f516fbff58c88465c10533e1e5b38088e9834904cab31ca7d4ff2 not found: ID does not exist" containerID="0d2ec604487f516fbff58c88465c10533e1e5b38088e9834904cab31ca7d4ff2" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.001006 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d2ec604487f516fbff58c88465c10533e1e5b38088e9834904cab31ca7d4ff2"} err="failed to get container status \"0d2ec604487f516fbff58c88465c10533e1e5b38088e9834904cab31ca7d4ff2\": rpc error: code = NotFound desc = could not find container \"0d2ec604487f516fbff58c88465c10533e1e5b38088e9834904cab31ca7d4ff2\": container with ID starting with 0d2ec604487f516fbff58c88465c10533e1e5b38088e9834904cab31ca7d4ff2 not found: ID does not exist" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.001036 4755 scope.go:117] "RemoveContainer" containerID="c820206b68649e85fa707af0bfe1f1d2e13c43a33c9030814d4ae180d574eac5" Oct 06 08:37:43 crc kubenswrapper[4755]: E1006 08:37:43.001266 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c820206b68649e85fa707af0bfe1f1d2e13c43a33c9030814d4ae180d574eac5\": container with ID starting with c820206b68649e85fa707af0bfe1f1d2e13c43a33c9030814d4ae180d574eac5 not found: ID does not exist" containerID="c820206b68649e85fa707af0bfe1f1d2e13c43a33c9030814d4ae180d574eac5" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.001292 
4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c820206b68649e85fa707af0bfe1f1d2e13c43a33c9030814d4ae180d574eac5"} err="failed to get container status \"c820206b68649e85fa707af0bfe1f1d2e13c43a33c9030814d4ae180d574eac5\": rpc error: code = NotFound desc = could not find container \"c820206b68649e85fa707af0bfe1f1d2e13c43a33c9030814d4ae180d574eac5\": container with ID starting with c820206b68649e85fa707af0bfe1f1d2e13c43a33c9030814d4ae180d574eac5 not found: ID does not exist" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.001306 4755 scope.go:117] "RemoveContainer" containerID="f8d4e4c6713b9ff55b73dafd539b33756e35e0ffc41b4d9911ef997e3dbc223c" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.021946 4755 scope.go:117] "RemoveContainer" containerID="0d53fc0f5c82899dca17046af4f7a832765e3f1c900870ff846473b2db0bbb8e" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.026191 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/733a7b61-b175-4381-ade8-91bd0714c2fa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "733a7b61-b175-4381-ade8-91bd0714c2fa" (UID: "733a7b61-b175-4381-ade8-91bd0714c2fa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.031850 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/733a7b61-b175-4381-ade8-91bd0714c2fa-config" (OuterVolumeSpecName: "config") pod "733a7b61-b175-4381-ade8-91bd0714c2fa" (UID: "733a7b61-b175-4381-ade8-91bd0714c2fa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.048651 4755 scope.go:117] "RemoveContainer" containerID="f8d4e4c6713b9ff55b73dafd539b33756e35e0ffc41b4d9911ef997e3dbc223c" Oct 06 08:37:43 crc kubenswrapper[4755]: E1006 08:37:43.049118 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d4e4c6713b9ff55b73dafd539b33756e35e0ffc41b4d9911ef997e3dbc223c\": container with ID starting with f8d4e4c6713b9ff55b73dafd539b33756e35e0ffc41b4d9911ef997e3dbc223c not found: ID does not exist" containerID="f8d4e4c6713b9ff55b73dafd539b33756e35e0ffc41b4d9911ef997e3dbc223c" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.049158 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d4e4c6713b9ff55b73dafd539b33756e35e0ffc41b4d9911ef997e3dbc223c"} err="failed to get container status \"f8d4e4c6713b9ff55b73dafd539b33756e35e0ffc41b4d9911ef997e3dbc223c\": rpc error: code = NotFound desc = could not find container \"f8d4e4c6713b9ff55b73dafd539b33756e35e0ffc41b4d9911ef997e3dbc223c\": container with ID starting with f8d4e4c6713b9ff55b73dafd539b33756e35e0ffc41b4d9911ef997e3dbc223c not found: ID does not exist" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.049193 4755 scope.go:117] "RemoveContainer" containerID="0d53fc0f5c82899dca17046af4f7a832765e3f1c900870ff846473b2db0bbb8e" Oct 06 08:37:43 crc kubenswrapper[4755]: E1006 08:37:43.049514 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d53fc0f5c82899dca17046af4f7a832765e3f1c900870ff846473b2db0bbb8e\": container with ID starting with 0d53fc0f5c82899dca17046af4f7a832765e3f1c900870ff846473b2db0bbb8e not found: ID does not exist" containerID="0d53fc0f5c82899dca17046af4f7a832765e3f1c900870ff846473b2db0bbb8e" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.049561 
4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d53fc0f5c82899dca17046af4f7a832765e3f1c900870ff846473b2db0bbb8e"} err="failed to get container status \"0d53fc0f5c82899dca17046af4f7a832765e3f1c900870ff846473b2db0bbb8e\": rpc error: code = NotFound desc = could not find container \"0d53fc0f5c82899dca17046af4f7a832765e3f1c900870ff846473b2db0bbb8e\": container with ID starting with 0d53fc0f5c82899dca17046af4f7a832765e3f1c900870ff846473b2db0bbb8e not found: ID does not exist" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.090801 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733a7b61-b175-4381-ade8-91bd0714c2fa-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.090832 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/733a7b61-b175-4381-ade8-91bd0714c2fa-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.090842 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tbzt\" (UniqueName: \"kubernetes.io/projected/733a7b61-b175-4381-ade8-91bd0714c2fa-kube-api-access-6tbzt\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.178669 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-flm4c"] Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.218833 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bsqzt"] Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.280142 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xmdfl"] Oct 06 08:37:43 crc kubenswrapper[4755]: W1006 08:37:43.302557 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f486fd_fc13_4344_8355_ffe4903af499.slice/crio-04aaaadd7e9fd65e0c09b5d3d186de3c1f4b03e657b47cb5fc144df1b1931bbd WatchSource:0}: Error finding container 04aaaadd7e9fd65e0c09b5d3d186de3c1f4b03e657b47cb5fc144df1b1931bbd: Status 404 returned error can't find the container with id 04aaaadd7e9fd65e0c09b5d3d186de3c1f4b03e657b47cb5fc144df1b1931bbd Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.354928 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 08:37:43 crc kubenswrapper[4755]: W1006 08:37:43.364268 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8aa143d7_a987_43a1_992c_7b33b12710dd.slice/crio-aa360aa95eb15fe14f89e3ee2c3223965f7d767d32e0f8fbc47bd3ec3afa71d0 WatchSource:0}: Error finding container aa360aa95eb15fe14f89e3ee2c3223965f7d767d32e0f8fbc47bd3ec3afa71d0: Status 404 returned error can't find the container with id aa360aa95eb15fe14f89e3ee2c3223965f7d767d32e0f8fbc47bd3ec3afa71d0 Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.451724 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pwl7d"] Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.457106 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pwl7d"] Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.895793 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="135425fd-b05d-4bce-97c2-a7ccc0f71a3e" path="/var/lib/kubelet/pods/135425fd-b05d-4bce-97c2-a7ccc0f71a3e/volumes" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.896882 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="733a7b61-b175-4381-ade8-91bd0714c2fa" path="/var/lib/kubelet/pods/733a7b61-b175-4381-ade8-91bd0714c2fa/volumes" Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.967247 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8aa143d7-a987-43a1-992c-7b33b12710dd","Type":"ContainerStarted","Data":"aa360aa95eb15fe14f89e3ee2c3223965f7d767d32e0f8fbc47bd3ec3afa71d0"} Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.971288 4755 generic.go:334] "Generic (PLEG): container finished" podID="b8f486fd-fc13-4344-8355-ffe4903af499" containerID="689049c388362c971ad4b02e28dcb9d2062278e0456e1c0d329df674eba35589" exitCode=0 Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.971602 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" event={"ID":"b8f486fd-fc13-4344-8355-ffe4903af499","Type":"ContainerDied","Data":"689049c388362c971ad4b02e28dcb9d2062278e0456e1c0d329df674eba35589"} Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.971720 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" event={"ID":"b8f486fd-fc13-4344-8355-ffe4903af499","Type":"ContainerStarted","Data":"04aaaadd7e9fd65e0c09b5d3d186de3c1f4b03e657b47cb5fc144df1b1931bbd"} Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.976990 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-flm4c" event={"ID":"c01d8771-a0d8-436a-883d-5fb95dec3b59","Type":"ContainerStarted","Data":"81da8b9849685b85aba3a604da473924683cc74c67987022c3012e96eb189d25"} Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.977042 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-flm4c" event={"ID":"c01d8771-a0d8-436a-883d-5fb95dec3b59","Type":"ContainerStarted","Data":"a6b86633e850c81ebb94a06044cc4de283ab2e818a0af9cb69c6ba1a7b7ad79e"} Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.980940 4755 generic.go:334] "Generic (PLEG): container finished" podID="3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" containerID="b9fc1bd844a85c6943ae59878d822af703cec012b06a7c1fb6da58cc1a0b534e" exitCode=0 
Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.980992 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" event={"ID":"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b","Type":"ContainerDied","Data":"b9fc1bd844a85c6943ae59878d822af703cec012b06a7c1fb6da58cc1a0b534e"} Oct 06 08:37:43 crc kubenswrapper[4755]: I1006 08:37:43.981014 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" event={"ID":"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b","Type":"ContainerStarted","Data":"0cdc1b48cdace607d3602009e4879941b71b53c4d86c31fc5025fde3d8212e39"} Oct 06 08:37:44 crc kubenswrapper[4755]: I1006 08:37:44.116667 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-flm4c" podStartSLOduration=2.116648354 podStartE2EDuration="2.116648354s" podCreationTimestamp="2025-10-06 08:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:37:44.036672106 +0000 UTC m=+920.865987320" watchObservedRunningTime="2025-10-06 08:37:44.116648354 +0000 UTC m=+920.945963568" Oct 06 08:37:45 crc kubenswrapper[4755]: I1006 08:37:45.004549 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8aa143d7-a987-43a1-992c-7b33b12710dd","Type":"ContainerStarted","Data":"6fb689288ef282dcdb61f3fd40329a9e059bd9725dfbafde293d814d716fc3e4"} Oct 06 08:37:45 crc kubenswrapper[4755]: I1006 08:37:45.006271 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" event={"ID":"b8f486fd-fc13-4344-8355-ffe4903af499","Type":"ContainerStarted","Data":"8d14750935a62fcd5d7a4b129f22432001f2cfbdd5dd6a30e6d3238d79b83a16"} Oct 06 08:37:45 crc kubenswrapper[4755]: I1006 08:37:45.006770 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 
08:37:45 crc kubenswrapper[4755]: I1006 08:37:45.009101 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" event={"ID":"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b","Type":"ContainerStarted","Data":"6474e483a747d46e7481bb383840eec31fc58ad2ead94f9e53a095811837a5a3"} Oct 06 08:37:45 crc kubenswrapper[4755]: I1006 08:37:45.026981 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" podStartSLOduration=3.026957376 podStartE2EDuration="3.026957376s" podCreationTimestamp="2025-10-06 08:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:37:45.026201147 +0000 UTC m=+921.855516361" watchObservedRunningTime="2025-10-06 08:37:45.026957376 +0000 UTC m=+921.856272590" Oct 06 08:37:45 crc kubenswrapper[4755]: I1006 08:37:45.046964 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" podStartSLOduration=3.046948662 podStartE2EDuration="3.046948662s" podCreationTimestamp="2025-10-06 08:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:37:45.044479282 +0000 UTC m=+921.873794506" watchObservedRunningTime="2025-10-06 08:37:45.046948662 +0000 UTC m=+921.876263866" Oct 06 08:37:45 crc kubenswrapper[4755]: I1006 08:37:45.729997 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 06 08:37:45 crc kubenswrapper[4755]: I1006 08:37:45.730353 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 06 08:37:45 crc kubenswrapper[4755]: I1006 08:37:45.793039 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 06 08:37:46 crc kubenswrapper[4755]: 
I1006 08:37:46.017111 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8aa143d7-a987-43a1-992c-7b33b12710dd","Type":"ContainerStarted","Data":"efebbdae3a8ba55f51e8d1cf96bd6fce7a1460ff35a3649d30779fa475cb5487"} Oct 06 08:37:46 crc kubenswrapper[4755]: I1006 08:37:46.018497 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:46 crc kubenswrapper[4755]: I1006 08:37:46.038400 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.664165299 podStartE2EDuration="4.03837227s" podCreationTimestamp="2025-10-06 08:37:42 +0000 UTC" firstStartedPulling="2025-10-06 08:37:43.366221105 +0000 UTC m=+920.195536319" lastFinishedPulling="2025-10-06 08:37:44.740428076 +0000 UTC m=+921.569743290" observedRunningTime="2025-10-06 08:37:46.032294983 +0000 UTC m=+922.861610237" watchObservedRunningTime="2025-10-06 08:37:46.03837227 +0000 UTC m=+922.867687474" Oct 06 08:37:46 crc kubenswrapper[4755]: I1006 08:37:46.066288 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.706280 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.707579 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.750817 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.816956 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-pbf24"] Oct 06 08:37:48 crc kubenswrapper[4755]: E1006 08:37:46.817246 4755 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="733a7b61-b175-4381-ade8-91bd0714c2fa" containerName="dnsmasq-dns" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.817257 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="733a7b61-b175-4381-ade8-91bd0714c2fa" containerName="dnsmasq-dns" Oct 06 08:37:48 crc kubenswrapper[4755]: E1006 08:37:46.817269 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="135425fd-b05d-4bce-97c2-a7ccc0f71a3e" containerName="dnsmasq-dns" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.817276 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="135425fd-b05d-4bce-97c2-a7ccc0f71a3e" containerName="dnsmasq-dns" Oct 06 08:37:48 crc kubenswrapper[4755]: E1006 08:37:46.817325 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733a7b61-b175-4381-ade8-91bd0714c2fa" containerName="init" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.817331 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="733a7b61-b175-4381-ade8-91bd0714c2fa" containerName="init" Oct 06 08:37:48 crc kubenswrapper[4755]: E1006 08:37:46.817339 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="135425fd-b05d-4bce-97c2-a7ccc0f71a3e" containerName="init" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.817344 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="135425fd-b05d-4bce-97c2-a7ccc0f71a3e" containerName="init" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.817480 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="135425fd-b05d-4bce-97c2-a7ccc0f71a3e" containerName="dnsmasq-dns" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.817502 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="733a7b61-b175-4381-ade8-91bd0714c2fa" containerName="dnsmasq-dns" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.818043 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pbf24" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.831258 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pbf24"] Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.949929 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgxfw\" (UniqueName: \"kubernetes.io/projected/0214974c-3ea9-468d-84dc-a941cddf9f94-kube-api-access-kgxfw\") pod \"keystone-db-create-pbf24\" (UID: \"0214974c-3ea9-468d-84dc-a941cddf9f94\") " pod="openstack/keystone-db-create-pbf24" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.981527 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xr272"] Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.982687 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xr272" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:46.991695 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xr272"] Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:47.024058 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:47.051870 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgxfw\" (UniqueName: \"kubernetes.io/projected/0214974c-3ea9-468d-84dc-a941cddf9f94-kube-api-access-kgxfw\") pod \"keystone-db-create-pbf24\" (UID: \"0214974c-3ea9-468d-84dc-a941cddf9f94\") " pod="openstack/keystone-db-create-pbf24" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:47.051914 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n888\" (UniqueName: \"kubernetes.io/projected/9edaa26f-c908-44f8-92ea-48f25d7febc3-kube-api-access-8n888\") pod 
\"placement-db-create-xr272\" (UID: \"9edaa26f-c908-44f8-92ea-48f25d7febc3\") " pod="openstack/placement-db-create-xr272" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:47.067931 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:47.071733 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgxfw\" (UniqueName: \"kubernetes.io/projected/0214974c-3ea9-468d-84dc-a941cddf9f94-kube-api-access-kgxfw\") pod \"keystone-db-create-pbf24\" (UID: \"0214974c-3ea9-468d-84dc-a941cddf9f94\") " pod="openstack/keystone-db-create-pbf24" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:47.142908 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pbf24" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:47.153336 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n888\" (UniqueName: \"kubernetes.io/projected/9edaa26f-c908-44f8-92ea-48f25d7febc3-kube-api-access-8n888\") pod \"placement-db-create-xr272\" (UID: \"9edaa26f-c908-44f8-92ea-48f25d7febc3\") " pod="openstack/placement-db-create-xr272" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:47.175664 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n888\" (UniqueName: \"kubernetes.io/projected/9edaa26f-c908-44f8-92ea-48f25d7febc3-kube-api-access-8n888\") pod \"placement-db-create-xr272\" (UID: \"9edaa26f-c908-44f8-92ea-48f25d7febc3\") " pod="openstack/placement-db-create-xr272" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:47.306279 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xr272" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:47.763780 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-pwl7d" podUID="733a7b61-b175-4381-ade8-91bd0714c2fa" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.98:5353: i/o timeout" Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:48.717805 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pbf24"] Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:48.732938 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xr272"] Oct 06 08:37:48 crc kubenswrapper[4755]: I1006 08:37:48.908534 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 06 08:37:49 crc kubenswrapper[4755]: I1006 08:37:49.041039 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pbf24" event={"ID":"0214974c-3ea9-468d-84dc-a941cddf9f94","Type":"ContainerStarted","Data":"3050ecbac027772d04ee0b056f17ba3561b82ad78a50db56d9c617d68763bcd6"} Oct 06 08:37:49 crc kubenswrapper[4755]: I1006 08:37:49.042235 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xr272" event={"ID":"9edaa26f-c908-44f8-92ea-48f25d7febc3","Type":"ContainerStarted","Data":"ec1e2caaa139dd1c9b8abb65a8308301250e48f57a25141f3333beba17f0e9de"} Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.068947 4755 generic.go:334] "Generic (PLEG): container finished" podID="9edaa26f-c908-44f8-92ea-48f25d7febc3" containerID="3db8da0f0690dd6cb2ab0504c0c382d8510a60669bd723374d03c10babb7d06a" exitCode=0 Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.069001 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xr272" 
event={"ID":"9edaa26f-c908-44f8-92ea-48f25d7febc3","Type":"ContainerDied","Data":"3db8da0f0690dd6cb2ab0504c0c382d8510a60669bd723374d03c10babb7d06a"} Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.071410 4755 generic.go:334] "Generic (PLEG): container finished" podID="0214974c-3ea9-468d-84dc-a941cddf9f94" containerID="e0d49d956ef0dd0a1d1255ab6ca8ec652b741bcba585ce9cccff8a872551c566" exitCode=0 Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.071447 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pbf24" event={"ID":"0214974c-3ea9-468d-84dc-a941cddf9f94","Type":"ContainerDied","Data":"e0d49d956ef0dd0a1d1255ab6ca8ec652b741bcba585ce9cccff8a872551c566"} Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.316717 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-sjq7l"] Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.319710 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sjq7l" Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.330292 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-sjq7l"] Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.439100 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-697dc\" (UniqueName: \"kubernetes.io/projected/3254a8f8-f719-442c-b1a0-31a59dad705e-kube-api-access-697dc\") pod \"glance-db-create-sjq7l\" (UID: \"3254a8f8-f719-442c-b1a0-31a59dad705e\") " pod="openstack/glance-db-create-sjq7l" Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.540357 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-697dc\" (UniqueName: \"kubernetes.io/projected/3254a8f8-f719-442c-b1a0-31a59dad705e-kube-api-access-697dc\") pod \"glance-db-create-sjq7l\" (UID: \"3254a8f8-f719-442c-b1a0-31a59dad705e\") " 
pod="openstack/glance-db-create-sjq7l" Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.557965 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-697dc\" (UniqueName: \"kubernetes.io/projected/3254a8f8-f719-442c-b1a0-31a59dad705e-kube-api-access-697dc\") pod \"glance-db-create-sjq7l\" (UID: \"3254a8f8-f719-442c-b1a0-31a59dad705e\") " pod="openstack/glance-db-create-sjq7l" Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.638729 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.646907 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sjq7l" Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.753298 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:37:52 crc kubenswrapper[4755]: I1006 08:37:52.809584 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xmdfl"] Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.079133 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" podUID="b8f486fd-fc13-4344-8355-ffe4903af499" containerName="dnsmasq-dns" containerID="cri-o://8d14750935a62fcd5d7a4b129f22432001f2cfbdd5dd6a30e6d3238d79b83a16" gracePeriod=10 Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.127132 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-sjq7l"] Oct 06 08:37:53 crc kubenswrapper[4755]: W1006 08:37:53.169807 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3254a8f8_f719_442c_b1a0_31a59dad705e.slice/crio-a7f2749d1215e8f7ff07a00d0fdfab8d44da78d55e241dc839647b159eaef9dc WatchSource:0}: Error finding container 
a7f2749d1215e8f7ff07a00d0fdfab8d44da78d55e241dc839647b159eaef9dc: Status 404 returned error can't find the container with id a7f2749d1215e8f7ff07a00d0fdfab8d44da78d55e241dc839647b159eaef9dc Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.426432 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pbf24" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.475105 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xr272" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.545011 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.605088 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n888\" (UniqueName: \"kubernetes.io/projected/9edaa26f-c908-44f8-92ea-48f25d7febc3-kube-api-access-8n888\") pod \"9edaa26f-c908-44f8-92ea-48f25d7febc3\" (UID: \"9edaa26f-c908-44f8-92ea-48f25d7febc3\") " Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.605226 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgxfw\" (UniqueName: \"kubernetes.io/projected/0214974c-3ea9-468d-84dc-a941cddf9f94-kube-api-access-kgxfw\") pod \"0214974c-3ea9-468d-84dc-a941cddf9f94\" (UID: \"0214974c-3ea9-468d-84dc-a941cddf9f94\") " Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.614795 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0214974c-3ea9-468d-84dc-a941cddf9f94-kube-api-access-kgxfw" (OuterVolumeSpecName: "kube-api-access-kgxfw") pod "0214974c-3ea9-468d-84dc-a941cddf9f94" (UID: "0214974c-3ea9-468d-84dc-a941cddf9f94"). InnerVolumeSpecName "kube-api-access-kgxfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.620748 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9edaa26f-c908-44f8-92ea-48f25d7febc3-kube-api-access-8n888" (OuterVolumeSpecName: "kube-api-access-8n888") pod "9edaa26f-c908-44f8-92ea-48f25d7febc3" (UID: "9edaa26f-c908-44f8-92ea-48f25d7febc3"). InnerVolumeSpecName "kube-api-access-8n888". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.706701 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-ovsdbserver-sb\") pod \"b8f486fd-fc13-4344-8355-ffe4903af499\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.706761 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-config\") pod \"b8f486fd-fc13-4344-8355-ffe4903af499\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.706873 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-dns-svc\") pod \"b8f486fd-fc13-4344-8355-ffe4903af499\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.706900 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b49bd\" (UniqueName: \"kubernetes.io/projected/b8f486fd-fc13-4344-8355-ffe4903af499-kube-api-access-b49bd\") pod \"b8f486fd-fc13-4344-8355-ffe4903af499\" (UID: \"b8f486fd-fc13-4344-8355-ffe4903af499\") " Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.707193 4755 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgxfw\" (UniqueName: \"kubernetes.io/projected/0214974c-3ea9-468d-84dc-a941cddf9f94-kube-api-access-kgxfw\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.707212 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n888\" (UniqueName: \"kubernetes.io/projected/9edaa26f-c908-44f8-92ea-48f25d7febc3-kube-api-access-8n888\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.711492 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f486fd-fc13-4344-8355-ffe4903af499-kube-api-access-b49bd" (OuterVolumeSpecName: "kube-api-access-b49bd") pod "b8f486fd-fc13-4344-8355-ffe4903af499" (UID: "b8f486fd-fc13-4344-8355-ffe4903af499"). InnerVolumeSpecName "kube-api-access-b49bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.796075 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-config" (OuterVolumeSpecName: "config") pod "b8f486fd-fc13-4344-8355-ffe4903af499" (UID: "b8f486fd-fc13-4344-8355-ffe4903af499"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.797158 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8f486fd-fc13-4344-8355-ffe4903af499" (UID: "b8f486fd-fc13-4344-8355-ffe4903af499"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.808347 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.808373 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.808382 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b49bd\" (UniqueName: \"kubernetes.io/projected/b8f486fd-fc13-4344-8355-ffe4903af499-kube-api-access-b49bd\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.828272 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b8f486fd-fc13-4344-8355-ffe4903af499" (UID: "b8f486fd-fc13-4344-8355-ffe4903af499"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:53 crc kubenswrapper[4755]: I1006 08:37:53.909273 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8f486fd-fc13-4344-8355-ffe4903af499-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.088931 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xr272" Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.089300 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xr272" event={"ID":"9edaa26f-c908-44f8-92ea-48f25d7febc3","Type":"ContainerDied","Data":"ec1e2caaa139dd1c9b8abb65a8308301250e48f57a25141f3333beba17f0e9de"} Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.089341 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec1e2caaa139dd1c9b8abb65a8308301250e48f57a25141f3333beba17f0e9de" Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.090746 4755 generic.go:334] "Generic (PLEG): container finished" podID="3254a8f8-f719-442c-b1a0-31a59dad705e" containerID="dc7672cfba047a76274ca9505095c74182f23f1290799700674b7119fe58d9b2" exitCode=0 Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.090817 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sjq7l" event={"ID":"3254a8f8-f719-442c-b1a0-31a59dad705e","Type":"ContainerDied","Data":"dc7672cfba047a76274ca9505095c74182f23f1290799700674b7119fe58d9b2"} Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.090857 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sjq7l" event={"ID":"3254a8f8-f719-442c-b1a0-31a59dad705e","Type":"ContainerStarted","Data":"a7f2749d1215e8f7ff07a00d0fdfab8d44da78d55e241dc839647b159eaef9dc"} Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.093341 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pbf24" event={"ID":"0214974c-3ea9-468d-84dc-a941cddf9f94","Type":"ContainerDied","Data":"3050ecbac027772d04ee0b056f17ba3561b82ad78a50db56d9c617d68763bcd6"} Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.093374 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3050ecbac027772d04ee0b056f17ba3561b82ad78a50db56d9c617d68763bcd6" Oct 06 
08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.093391 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pbf24" Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.095811 4755 generic.go:334] "Generic (PLEG): container finished" podID="b8f486fd-fc13-4344-8355-ffe4903af499" containerID="8d14750935a62fcd5d7a4b129f22432001f2cfbdd5dd6a30e6d3238d79b83a16" exitCode=0 Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.095848 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" event={"ID":"b8f486fd-fc13-4344-8355-ffe4903af499","Type":"ContainerDied","Data":"8d14750935a62fcd5d7a4b129f22432001f2cfbdd5dd6a30e6d3238d79b83a16"} Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.095870 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" event={"ID":"b8f486fd-fc13-4344-8355-ffe4903af499","Type":"ContainerDied","Data":"04aaaadd7e9fd65e0c09b5d3d186de3c1f4b03e657b47cb5fc144df1b1931bbd"} Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.095890 4755 scope.go:117] "RemoveContainer" containerID="8d14750935a62fcd5d7a4b129f22432001f2cfbdd5dd6a30e6d3238d79b83a16" Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.095902 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xmdfl" Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.121961 4755 scope.go:117] "RemoveContainer" containerID="689049c388362c971ad4b02e28dcb9d2062278e0456e1c0d329df674eba35589" Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.128382 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xmdfl"] Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.134195 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xmdfl"] Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.141443 4755 scope.go:117] "RemoveContainer" containerID="8d14750935a62fcd5d7a4b129f22432001f2cfbdd5dd6a30e6d3238d79b83a16" Oct 06 08:37:54 crc kubenswrapper[4755]: E1006 08:37:54.142171 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d14750935a62fcd5d7a4b129f22432001f2cfbdd5dd6a30e6d3238d79b83a16\": container with ID starting with 8d14750935a62fcd5d7a4b129f22432001f2cfbdd5dd6a30e6d3238d79b83a16 not found: ID does not exist" containerID="8d14750935a62fcd5d7a4b129f22432001f2cfbdd5dd6a30e6d3238d79b83a16" Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.142218 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d14750935a62fcd5d7a4b129f22432001f2cfbdd5dd6a30e6d3238d79b83a16"} err="failed to get container status \"8d14750935a62fcd5d7a4b129f22432001f2cfbdd5dd6a30e6d3238d79b83a16\": rpc error: code = NotFound desc = could not find container \"8d14750935a62fcd5d7a4b129f22432001f2cfbdd5dd6a30e6d3238d79b83a16\": container with ID starting with 8d14750935a62fcd5d7a4b129f22432001f2cfbdd5dd6a30e6d3238d79b83a16 not found: ID does not exist" Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.142250 4755 scope.go:117] "RemoveContainer" containerID="689049c388362c971ad4b02e28dcb9d2062278e0456e1c0d329df674eba35589" Oct 06 
08:37:54 crc kubenswrapper[4755]: E1006 08:37:54.142699 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"689049c388362c971ad4b02e28dcb9d2062278e0456e1c0d329df674eba35589\": container with ID starting with 689049c388362c971ad4b02e28dcb9d2062278e0456e1c0d329df674eba35589 not found: ID does not exist" containerID="689049c388362c971ad4b02e28dcb9d2062278e0456e1c0d329df674eba35589" Oct 06 08:37:54 crc kubenswrapper[4755]: I1006 08:37:54.142727 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689049c388362c971ad4b02e28dcb9d2062278e0456e1c0d329df674eba35589"} err="failed to get container status \"689049c388362c971ad4b02e28dcb9d2062278e0456e1c0d329df674eba35589\": rpc error: code = NotFound desc = could not find container \"689049c388362c971ad4b02e28dcb9d2062278e0456e1c0d329df674eba35589\": container with ID starting with 689049c388362c971ad4b02e28dcb9d2062278e0456e1c0d329df674eba35589 not found: ID does not exist" Oct 06 08:37:55 crc kubenswrapper[4755]: I1006 08:37:55.454429 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sjq7l" Oct 06 08:37:55 crc kubenswrapper[4755]: I1006 08:37:55.639765 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-697dc\" (UniqueName: \"kubernetes.io/projected/3254a8f8-f719-442c-b1a0-31a59dad705e-kube-api-access-697dc\") pod \"3254a8f8-f719-442c-b1a0-31a59dad705e\" (UID: \"3254a8f8-f719-442c-b1a0-31a59dad705e\") " Oct 06 08:37:55 crc kubenswrapper[4755]: I1006 08:37:55.647688 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3254a8f8-f719-442c-b1a0-31a59dad705e-kube-api-access-697dc" (OuterVolumeSpecName: "kube-api-access-697dc") pod "3254a8f8-f719-442c-b1a0-31a59dad705e" (UID: "3254a8f8-f719-442c-b1a0-31a59dad705e"). 
InnerVolumeSpecName "kube-api-access-697dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:37:55 crc kubenswrapper[4755]: I1006 08:37:55.741631 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-697dc\" (UniqueName: \"kubernetes.io/projected/3254a8f8-f719-442c-b1a0-31a59dad705e-kube-api-access-697dc\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:55 crc kubenswrapper[4755]: I1006 08:37:55.895125 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f486fd-fc13-4344-8355-ffe4903af499" path="/var/lib/kubelet/pods/b8f486fd-fc13-4344-8355-ffe4903af499/volumes" Oct 06 08:37:56 crc kubenswrapper[4755]: I1006 08:37:56.135300 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sjq7l" event={"ID":"3254a8f8-f719-442c-b1a0-31a59dad705e","Type":"ContainerDied","Data":"a7f2749d1215e8f7ff07a00d0fdfab8d44da78d55e241dc839647b159eaef9dc"} Oct 06 08:37:56 crc kubenswrapper[4755]: I1006 08:37:56.135350 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7f2749d1215e8f7ff07a00d0fdfab8d44da78d55e241dc839647b159eaef9dc" Oct 06 08:37:56 crc kubenswrapper[4755]: I1006 08:37:56.135347 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-sjq7l" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.109769 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1e0a-account-create-ljjqb"] Oct 06 08:37:57 crc kubenswrapper[4755]: E1006 08:37:57.110288 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f486fd-fc13-4344-8355-ffe4903af499" containerName="dnsmasq-dns" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.110299 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f486fd-fc13-4344-8355-ffe4903af499" containerName="dnsmasq-dns" Oct 06 08:37:57 crc kubenswrapper[4755]: E1006 08:37:57.110388 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0214974c-3ea9-468d-84dc-a941cddf9f94" containerName="mariadb-database-create" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.110398 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0214974c-3ea9-468d-84dc-a941cddf9f94" containerName="mariadb-database-create" Oct 06 08:37:57 crc kubenswrapper[4755]: E1006 08:37:57.110412 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9edaa26f-c908-44f8-92ea-48f25d7febc3" containerName="mariadb-database-create" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.110419 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9edaa26f-c908-44f8-92ea-48f25d7febc3" containerName="mariadb-database-create" Oct 06 08:37:57 crc kubenswrapper[4755]: E1006 08:37:57.110428 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3254a8f8-f719-442c-b1a0-31a59dad705e" containerName="mariadb-database-create" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.110434 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3254a8f8-f719-442c-b1a0-31a59dad705e" containerName="mariadb-database-create" Oct 06 08:37:57 crc kubenswrapper[4755]: E1006 08:37:57.110443 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b8f486fd-fc13-4344-8355-ffe4903af499" containerName="init" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.110448 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f486fd-fc13-4344-8355-ffe4903af499" containerName="init" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.110609 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f486fd-fc13-4344-8355-ffe4903af499" containerName="dnsmasq-dns" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.110620 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9edaa26f-c908-44f8-92ea-48f25d7febc3" containerName="mariadb-database-create" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.110635 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3254a8f8-f719-442c-b1a0-31a59dad705e" containerName="mariadb-database-create" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.110645 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0214974c-3ea9-468d-84dc-a941cddf9f94" containerName="mariadb-database-create" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.111073 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1e0a-account-create-ljjqb" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.113271 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.120970 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1e0a-account-create-ljjqb"] Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.270328 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2ln7\" (UniqueName: \"kubernetes.io/projected/3c5ce319-fe48-4954-afae-bf595efca444-kube-api-access-l2ln7\") pod \"placement-1e0a-account-create-ljjqb\" (UID: \"3c5ce319-fe48-4954-afae-bf595efca444\") " pod="openstack/placement-1e0a-account-create-ljjqb" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.371944 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2ln7\" (UniqueName: \"kubernetes.io/projected/3c5ce319-fe48-4954-afae-bf595efca444-kube-api-access-l2ln7\") pod \"placement-1e0a-account-create-ljjqb\" (UID: \"3c5ce319-fe48-4954-afae-bf595efca444\") " pod="openstack/placement-1e0a-account-create-ljjqb" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.397332 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2ln7\" (UniqueName: \"kubernetes.io/projected/3c5ce319-fe48-4954-afae-bf595efca444-kube-api-access-l2ln7\") pod \"placement-1e0a-account-create-ljjqb\" (UID: \"3c5ce319-fe48-4954-afae-bf595efca444\") " pod="openstack/placement-1e0a-account-create-ljjqb" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.429509 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1e0a-account-create-ljjqb" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.815616 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 06 08:37:57 crc kubenswrapper[4755]: I1006 08:37:57.862610 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1e0a-account-create-ljjqb"] Oct 06 08:37:57 crc kubenswrapper[4755]: W1006 08:37:57.873163 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c5ce319_fe48_4954_afae_bf595efca444.slice/crio-71aae9ba3c05c850bd2eca81a13477e5b38b4be233917f4cee0bb72413e86d92 WatchSource:0}: Error finding container 71aae9ba3c05c850bd2eca81a13477e5b38b4be233917f4cee0bb72413e86d92: Status 404 returned error can't find the container with id 71aae9ba3c05c850bd2eca81a13477e5b38b4be233917f4cee0bb72413e86d92 Oct 06 08:37:58 crc kubenswrapper[4755]: I1006 08:37:58.152124 4755 generic.go:334] "Generic (PLEG): container finished" podID="3c5ce319-fe48-4954-afae-bf595efca444" containerID="21339d98c34787c448e9ed0f53bd4cd06bada5f4decadcc27caacb6438473fa5" exitCode=0 Oct 06 08:37:58 crc kubenswrapper[4755]: I1006 08:37:58.152176 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1e0a-account-create-ljjqb" event={"ID":"3c5ce319-fe48-4954-afae-bf595efca444","Type":"ContainerDied","Data":"21339d98c34787c448e9ed0f53bd4cd06bada5f4decadcc27caacb6438473fa5"} Oct 06 08:37:58 crc kubenswrapper[4755]: I1006 08:37:58.152326 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1e0a-account-create-ljjqb" event={"ID":"3c5ce319-fe48-4954-afae-bf595efca444","Type":"ContainerStarted","Data":"71aae9ba3c05c850bd2eca81a13477e5b38b4be233917f4cee0bb72413e86d92"} Oct 06 08:37:59 crc kubenswrapper[4755]: I1006 08:37:59.485416 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1e0a-account-create-ljjqb" Oct 06 08:37:59 crc kubenswrapper[4755]: I1006 08:37:59.617851 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2ln7\" (UniqueName: \"kubernetes.io/projected/3c5ce319-fe48-4954-afae-bf595efca444-kube-api-access-l2ln7\") pod \"3c5ce319-fe48-4954-afae-bf595efca444\" (UID: \"3c5ce319-fe48-4954-afae-bf595efca444\") " Oct 06 08:37:59 crc kubenswrapper[4755]: I1006 08:37:59.624843 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5ce319-fe48-4954-afae-bf595efca444-kube-api-access-l2ln7" (OuterVolumeSpecName: "kube-api-access-l2ln7") pod "3c5ce319-fe48-4954-afae-bf595efca444" (UID: "3c5ce319-fe48-4954-afae-bf595efca444"). InnerVolumeSpecName "kube-api-access-l2ln7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:37:59 crc kubenswrapper[4755]: I1006 08:37:59.719944 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2ln7\" (UniqueName: \"kubernetes.io/projected/3c5ce319-fe48-4954-afae-bf595efca444-kube-api-access-l2ln7\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:00 crc kubenswrapper[4755]: I1006 08:38:00.166600 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1e0a-account-create-ljjqb" event={"ID":"3c5ce319-fe48-4954-afae-bf595efca444","Type":"ContainerDied","Data":"71aae9ba3c05c850bd2eca81a13477e5b38b4be233917f4cee0bb72413e86d92"} Oct 06 08:38:00 crc kubenswrapper[4755]: I1006 08:38:00.166641 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71aae9ba3c05c850bd2eca81a13477e5b38b4be233917f4cee0bb72413e86d92" Oct 06 08:38:00 crc kubenswrapper[4755]: I1006 08:38:00.166615 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1e0a-account-create-ljjqb" Oct 06 08:38:02 crc kubenswrapper[4755]: I1006 08:38:02.467178 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1de5-account-create-hkvn2"] Oct 06 08:38:02 crc kubenswrapper[4755]: E1006 08:38:02.467916 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5ce319-fe48-4954-afae-bf595efca444" containerName="mariadb-account-create" Oct 06 08:38:02 crc kubenswrapper[4755]: I1006 08:38:02.467933 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5ce319-fe48-4954-afae-bf595efca444" containerName="mariadb-account-create" Oct 06 08:38:02 crc kubenswrapper[4755]: I1006 08:38:02.468173 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5ce319-fe48-4954-afae-bf595efca444" containerName="mariadb-account-create" Oct 06 08:38:02 crc kubenswrapper[4755]: I1006 08:38:02.468844 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1de5-account-create-hkvn2" Oct 06 08:38:02 crc kubenswrapper[4755]: I1006 08:38:02.471385 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 06 08:38:02 crc kubenswrapper[4755]: I1006 08:38:02.476095 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1de5-account-create-hkvn2"] Oct 06 08:38:02 crc kubenswrapper[4755]: I1006 08:38:02.664235 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mwbq\" (UniqueName: \"kubernetes.io/projected/abe22dcc-4c4c-43fb-9cc3-5968481dcbb7-kube-api-access-2mwbq\") pod \"glance-1de5-account-create-hkvn2\" (UID: \"abe22dcc-4c4c-43fb-9cc3-5968481dcbb7\") " pod="openstack/glance-1de5-account-create-hkvn2" Oct 06 08:38:02 crc kubenswrapper[4755]: I1006 08:38:02.766925 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mwbq\" (UniqueName: 
\"kubernetes.io/projected/abe22dcc-4c4c-43fb-9cc3-5968481dcbb7-kube-api-access-2mwbq\") pod \"glance-1de5-account-create-hkvn2\" (UID: \"abe22dcc-4c4c-43fb-9cc3-5968481dcbb7\") " pod="openstack/glance-1de5-account-create-hkvn2" Oct 06 08:38:02 crc kubenswrapper[4755]: I1006 08:38:02.785400 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mwbq\" (UniqueName: \"kubernetes.io/projected/abe22dcc-4c4c-43fb-9cc3-5968481dcbb7-kube-api-access-2mwbq\") pod \"glance-1de5-account-create-hkvn2\" (UID: \"abe22dcc-4c4c-43fb-9cc3-5968481dcbb7\") " pod="openstack/glance-1de5-account-create-hkvn2" Oct 06 08:38:02 crc kubenswrapper[4755]: I1006 08:38:02.830138 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1de5-account-create-hkvn2" Oct 06 08:38:03 crc kubenswrapper[4755]: I1006 08:38:03.278546 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1de5-account-create-hkvn2"] Oct 06 08:38:03 crc kubenswrapper[4755]: W1006 08:38:03.282950 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabe22dcc_4c4c_43fb_9cc3_5968481dcbb7.slice/crio-9e5c73d30a7dc7c4d6e853e548c52ef9d7c3f615282ea64c73f32a942d6f610d WatchSource:0}: Error finding container 9e5c73d30a7dc7c4d6e853e548c52ef9d7c3f615282ea64c73f32a942d6f610d: Status 404 returned error can't find the container with id 9e5c73d30a7dc7c4d6e853e548c52ef9d7c3f615282ea64c73f32a942d6f610d Oct 06 08:38:04 crc kubenswrapper[4755]: I1006 08:38:04.195107 4755 generic.go:334] "Generic (PLEG): container finished" podID="abe22dcc-4c4c-43fb-9cc3-5968481dcbb7" containerID="9c47c89b198612f8186c7879c23ee531f5979ab794ada6c97e91dbea021675c6" exitCode=0 Oct 06 08:38:04 crc kubenswrapper[4755]: I1006 08:38:04.195184 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1de5-account-create-hkvn2" 
event={"ID":"abe22dcc-4c4c-43fb-9cc3-5968481dcbb7","Type":"ContainerDied","Data":"9c47c89b198612f8186c7879c23ee531f5979ab794ada6c97e91dbea021675c6"} Oct 06 08:38:04 crc kubenswrapper[4755]: I1006 08:38:04.195514 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1de5-account-create-hkvn2" event={"ID":"abe22dcc-4c4c-43fb-9cc3-5968481dcbb7","Type":"ContainerStarted","Data":"9e5c73d30a7dc7c4d6e853e548c52ef9d7c3f615282ea64c73f32a942d6f610d"} Oct 06 08:38:05 crc kubenswrapper[4755]: I1006 08:38:05.504362 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1de5-account-create-hkvn2" Oct 06 08:38:05 crc kubenswrapper[4755]: I1006 08:38:05.511846 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mwbq\" (UniqueName: \"kubernetes.io/projected/abe22dcc-4c4c-43fb-9cc3-5968481dcbb7-kube-api-access-2mwbq\") pod \"abe22dcc-4c4c-43fb-9cc3-5968481dcbb7\" (UID: \"abe22dcc-4c4c-43fb-9cc3-5968481dcbb7\") " Oct 06 08:38:05 crc kubenswrapper[4755]: I1006 08:38:05.517752 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe22dcc-4c4c-43fb-9cc3-5968481dcbb7-kube-api-access-2mwbq" (OuterVolumeSpecName: "kube-api-access-2mwbq") pod "abe22dcc-4c4c-43fb-9cc3-5968481dcbb7" (UID: "abe22dcc-4c4c-43fb-9cc3-5968481dcbb7"). InnerVolumeSpecName "kube-api-access-2mwbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:05 crc kubenswrapper[4755]: I1006 08:38:05.613596 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mwbq\" (UniqueName: \"kubernetes.io/projected/abe22dcc-4c4c-43fb-9cc3-5968481dcbb7-kube-api-access-2mwbq\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:06 crc kubenswrapper[4755]: I1006 08:38:06.211150 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1de5-account-create-hkvn2" event={"ID":"abe22dcc-4c4c-43fb-9cc3-5968481dcbb7","Type":"ContainerDied","Data":"9e5c73d30a7dc7c4d6e853e548c52ef9d7c3f615282ea64c73f32a942d6f610d"} Oct 06 08:38:06 crc kubenswrapper[4755]: I1006 08:38:06.211183 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1de5-account-create-hkvn2" Oct 06 08:38:06 crc kubenswrapper[4755]: I1006 08:38:06.211189 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e5c73d30a7dc7c4d6e853e548c52ef9d7c3f615282ea64c73f32a942d6f610d" Oct 06 08:38:06 crc kubenswrapper[4755]: I1006 08:38:06.841293 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-164f-account-create-7fnv7"] Oct 06 08:38:06 crc kubenswrapper[4755]: E1006 08:38:06.843762 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe22dcc-4c4c-43fb-9cc3-5968481dcbb7" containerName="mariadb-account-create" Oct 06 08:38:06 crc kubenswrapper[4755]: I1006 08:38:06.843804 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe22dcc-4c4c-43fb-9cc3-5968481dcbb7" containerName="mariadb-account-create" Oct 06 08:38:06 crc kubenswrapper[4755]: I1006 08:38:06.844092 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe22dcc-4c4c-43fb-9cc3-5968481dcbb7" containerName="mariadb-account-create" Oct 06 08:38:06 crc kubenswrapper[4755]: I1006 08:38:06.847335 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-164f-account-create-7fnv7" Oct 06 08:38:06 crc kubenswrapper[4755]: I1006 08:38:06.849544 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 06 08:38:06 crc kubenswrapper[4755]: I1006 08:38:06.854080 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-164f-account-create-7fnv7"] Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.032756 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9bsc\" (UniqueName: \"kubernetes.io/projected/98b7aae3-9795-47d6-89db-2e4f91af9a0e-kube-api-access-v9bsc\") pod \"keystone-164f-account-create-7fnv7\" (UID: \"98b7aae3-9795-47d6-89db-2e4f91af9a0e\") " pod="openstack/keystone-164f-account-create-7fnv7" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.134013 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9bsc\" (UniqueName: \"kubernetes.io/projected/98b7aae3-9795-47d6-89db-2e4f91af9a0e-kube-api-access-v9bsc\") pod \"keystone-164f-account-create-7fnv7\" (UID: \"98b7aae3-9795-47d6-89db-2e4f91af9a0e\") " pod="openstack/keystone-164f-account-create-7fnv7" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.153218 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9bsc\" (UniqueName: \"kubernetes.io/projected/98b7aae3-9795-47d6-89db-2e4f91af9a0e-kube-api-access-v9bsc\") pod \"keystone-164f-account-create-7fnv7\" (UID: \"98b7aae3-9795-47d6-89db-2e4f91af9a0e\") " pod="openstack/keystone-164f-account-create-7fnv7" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.174173 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-164f-account-create-7fnv7" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.350139 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-b4rd2" podUID="5dbdee79-0740-4068-a155-e865fe787402" containerName="ovn-controller" probeResult="failure" output=< Oct 06 08:38:07 crc kubenswrapper[4755]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 06 08:38:07 crc kubenswrapper[4755]: > Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.366713 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.372975 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jd94b" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.586627 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-b4rd2-config-66pd2"] Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.587875 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.596639 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.596763 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b4rd2-config-66pd2"] Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.629207 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-164f-account-create-7fnv7"] Oct 06 08:38:07 crc kubenswrapper[4755]: W1006 08:38:07.638018 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98b7aae3_9795_47d6_89db_2e4f91af9a0e.slice/crio-a7e86399e41637439efc8a0c58dfcdcbbb747dd2410c5aaefe4eb1c0935a2a61 WatchSource:0}: Error finding container a7e86399e41637439efc8a0c58dfcdcbbb747dd2410c5aaefe4eb1c0935a2a61: Status 404 returned error can't find the container with id a7e86399e41637439efc8a0c58dfcdcbbb747dd2410c5aaefe4eb1c0935a2a61 Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.701571 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-fh9p5"] Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.702740 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.704993 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.705231 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-psxww" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.707985 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fh9p5"] Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.743157 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2kzf\" (UniqueName: \"kubernetes.io/projected/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-kube-api-access-r2kzf\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.743202 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-scripts\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.743322 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-additional-scripts\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.743365 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-run\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.743382 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-run-ovn\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.743398 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-log-ovn\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.844475 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmqhg\" (UniqueName: \"kubernetes.io/projected/97f025db-474e-4629-96e7-2ebbd9413fc4-kube-api-access-lmqhg\") pod \"glance-db-sync-fh9p5\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.844538 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-config-data\") pod \"glance-db-sync-fh9p5\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.844604 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-additional-scripts\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.844657 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-combined-ca-bundle\") pod \"glance-db-sync-fh9p5\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.844686 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-db-sync-config-data\") pod \"glance-db-sync-fh9p5\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.844707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-run\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.844724 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-run-ovn\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.844744 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-log-ovn\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.844789 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2kzf\" (UniqueName: \"kubernetes.io/projected/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-kube-api-access-r2kzf\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.844819 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-scripts\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.845028 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-run-ovn\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.845055 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-run\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.845110 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-log-ovn\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.845439 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-additional-scripts\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.847379 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-scripts\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.866693 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2kzf\" (UniqueName: \"kubernetes.io/projected/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-kube-api-access-r2kzf\") pod \"ovn-controller-b4rd2-config-66pd2\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.904887 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.945838 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-combined-ca-bundle\") pod \"glance-db-sync-fh9p5\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.945893 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-db-sync-config-data\") pod \"glance-db-sync-fh9p5\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.946010 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmqhg\" (UniqueName: \"kubernetes.io/projected/97f025db-474e-4629-96e7-2ebbd9413fc4-kube-api-access-lmqhg\") pod \"glance-db-sync-fh9p5\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.946051 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-config-data\") pod \"glance-db-sync-fh9p5\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.950264 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-db-sync-config-data\") pod \"glance-db-sync-fh9p5\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:07 crc kubenswrapper[4755]: 
I1006 08:38:07.950382 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-config-data\") pod \"glance-db-sync-fh9p5\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.953061 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-combined-ca-bundle\") pod \"glance-db-sync-fh9p5\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:07 crc kubenswrapper[4755]: I1006 08:38:07.966074 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmqhg\" (UniqueName: \"kubernetes.io/projected/97f025db-474e-4629-96e7-2ebbd9413fc4-kube-api-access-lmqhg\") pod \"glance-db-sync-fh9p5\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:08 crc kubenswrapper[4755]: I1006 08:38:08.081828 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:08 crc kubenswrapper[4755]: I1006 08:38:08.237842 4755 generic.go:334] "Generic (PLEG): container finished" podID="98b7aae3-9795-47d6-89db-2e4f91af9a0e" containerID="6099ba2225afc605e432fc08f46d189f6b95b425f091591fe2c4aad0d5e20131" exitCode=0 Oct 06 08:38:08 crc kubenswrapper[4755]: I1006 08:38:08.237946 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-164f-account-create-7fnv7" event={"ID":"98b7aae3-9795-47d6-89db-2e4f91af9a0e","Type":"ContainerDied","Data":"6099ba2225afc605e432fc08f46d189f6b95b425f091591fe2c4aad0d5e20131"} Oct 06 08:38:08 crc kubenswrapper[4755]: I1006 08:38:08.237979 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-164f-account-create-7fnv7" event={"ID":"98b7aae3-9795-47d6-89db-2e4f91af9a0e","Type":"ContainerStarted","Data":"a7e86399e41637439efc8a0c58dfcdcbbb747dd2410c5aaefe4eb1c0935a2a61"} Oct 06 08:38:08 crc kubenswrapper[4755]: I1006 08:38:08.241356 4755 generic.go:334] "Generic (PLEG): container finished" podID="cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" containerID="1633e3c5c5ebfc34e508071c9f8e1f1237359e4c454fd67af3224492420f4fbd" exitCode=0 Oct 06 08:38:08 crc kubenswrapper[4755]: I1006 08:38:08.241446 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905","Type":"ContainerDied","Data":"1633e3c5c5ebfc34e508071c9f8e1f1237359e4c454fd67af3224492420f4fbd"} Oct 06 08:38:08 crc kubenswrapper[4755]: I1006 08:38:08.245049 4755 generic.go:334] "Generic (PLEG): container finished" podID="3d5d33a7-9480-466b-abb7-e8fc7cf08776" containerID="16177d3511ed44688be5cb711444e8d032e2e2c914042e75ecd13cca11fbce6d" exitCode=0 Oct 06 08:38:08 crc kubenswrapper[4755]: I1006 08:38:08.245159 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"3d5d33a7-9480-466b-abb7-e8fc7cf08776","Type":"ContainerDied","Data":"16177d3511ed44688be5cb711444e8d032e2e2c914042e75ecd13cca11fbce6d"} Oct 06 08:38:08 crc kubenswrapper[4755]: I1006 08:38:08.382476 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b4rd2-config-66pd2"] Oct 06 08:38:08 crc kubenswrapper[4755]: I1006 08:38:08.440459 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fh9p5"] Oct 06 08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.251045 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fh9p5" event={"ID":"97f025db-474e-4629-96e7-2ebbd9413fc4","Type":"ContainerStarted","Data":"ff8456bf5214bc06a79426935a7eab535145560df8c07084a74438b28dcd6059"} Oct 06 08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.253375 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905","Type":"ContainerStarted","Data":"038237b0e55a1bd0d8c875ac304255eecfd4658e5d098bb3afe46936d263c699"} Oct 06 08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.254394 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.256971 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d5d33a7-9480-466b-abb7-e8fc7cf08776","Type":"ContainerStarted","Data":"4e783bc16a71a21f85e7a0d4edbec9cf161b9d673784b7dc6aecaaaccbaf42fd"} Oct 06 08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.257170 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.258767 4755 generic.go:334] "Generic (PLEG): container finished" podID="2691ac33-9f65-4cdf-9fc6-a537afffe6e0" containerID="fe5d094a36cfb13b7145e44b7df53b7f187e3f71e3b534f5fa8aabea3a9361b2" exitCode=0 Oct 06 
08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.258817 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b4rd2-config-66pd2" event={"ID":"2691ac33-9f65-4cdf-9fc6-a537afffe6e0","Type":"ContainerDied","Data":"fe5d094a36cfb13b7145e44b7df53b7f187e3f71e3b534f5fa8aabea3a9361b2"} Oct 06 08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.258851 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b4rd2-config-66pd2" event={"ID":"2691ac33-9f65-4cdf-9fc6-a537afffe6e0","Type":"ContainerStarted","Data":"0adfb1112b272a80f61516bdcc1ce7a83e6f130bfddd2f0d1d950e24698f6eb3"} Oct 06 08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.287352 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=49.28047225 podStartE2EDuration="57.287329518s" podCreationTimestamp="2025-10-06 08:37:12 +0000 UTC" firstStartedPulling="2025-10-06 08:37:25.431884814 +0000 UTC m=+902.261200028" lastFinishedPulling="2025-10-06 08:37:33.438742082 +0000 UTC m=+910.268057296" observedRunningTime="2025-10-06 08:38:09.277216232 +0000 UTC m=+946.106531456" watchObservedRunningTime="2025-10-06 08:38:09.287329518 +0000 UTC m=+946.116644732" Oct 06 08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.318577 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.265118694 podStartE2EDuration="57.318551198s" podCreationTimestamp="2025-10-06 08:37:12 +0000 UTC" firstStartedPulling="2025-10-06 08:37:25.647752911 +0000 UTC m=+902.477068125" lastFinishedPulling="2025-10-06 08:37:33.701185415 +0000 UTC m=+910.530500629" observedRunningTime="2025-10-06 08:38:09.317119423 +0000 UTC m=+946.146434637" watchObservedRunningTime="2025-10-06 08:38:09.318551198 +0000 UTC m=+946.147866412" Oct 06 08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.600146 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-164f-account-create-7fnv7" Oct 06 08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.676382 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9bsc\" (UniqueName: \"kubernetes.io/projected/98b7aae3-9795-47d6-89db-2e4f91af9a0e-kube-api-access-v9bsc\") pod \"98b7aae3-9795-47d6-89db-2e4f91af9a0e\" (UID: \"98b7aae3-9795-47d6-89db-2e4f91af9a0e\") " Oct 06 08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.685172 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98b7aae3-9795-47d6-89db-2e4f91af9a0e-kube-api-access-v9bsc" (OuterVolumeSpecName: "kube-api-access-v9bsc") pod "98b7aae3-9795-47d6-89db-2e4f91af9a0e" (UID: "98b7aae3-9795-47d6-89db-2e4f91af9a0e"). InnerVolumeSpecName "kube-api-access-v9bsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:09 crc kubenswrapper[4755]: I1006 08:38:09.779464 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9bsc\" (UniqueName: \"kubernetes.io/projected/98b7aae3-9795-47d6-89db-2e4f91af9a0e-kube-api-access-v9bsc\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.271691 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-164f-account-create-7fnv7" event={"ID":"98b7aae3-9795-47d6-89db-2e4f91af9a0e","Type":"ContainerDied","Data":"a7e86399e41637439efc8a0c58dfcdcbbb747dd2410c5aaefe4eb1c0935a2a61"} Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.271746 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7e86399e41637439efc8a0c58dfcdcbbb747dd2410c5aaefe4eb1c0935a2a61" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.271826 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-164f-account-create-7fnv7" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.608525 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.695430 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-run\") pod \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.695476 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-log-ovn\") pod \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.695525 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-scripts\") pod \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.695543 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-run-ovn\") pod \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.695580 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-run" (OuterVolumeSpecName: "var-run") pod "2691ac33-9f65-4cdf-9fc6-a537afffe6e0" (UID: "2691ac33-9f65-4cdf-9fc6-a537afffe6e0"). 
InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.695596 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "2691ac33-9f65-4cdf-9fc6-a537afffe6e0" (UID: "2691ac33-9f65-4cdf-9fc6-a537afffe6e0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.695648 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2kzf\" (UniqueName: \"kubernetes.io/projected/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-kube-api-access-r2kzf\") pod \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.695678 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "2691ac33-9f65-4cdf-9fc6-a537afffe6e0" (UID: "2691ac33-9f65-4cdf-9fc6-a537afffe6e0"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.695698 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-additional-scripts\") pod \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\" (UID: \"2691ac33-9f65-4cdf-9fc6-a537afffe6e0\") " Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.696048 4755 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.696063 4755 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.696071 4755 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.696677 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "2691ac33-9f65-4cdf-9fc6-a537afffe6e0" (UID: "2691ac33-9f65-4cdf-9fc6-a537afffe6e0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.696770 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-scripts" (OuterVolumeSpecName: "scripts") pod "2691ac33-9f65-4cdf-9fc6-a537afffe6e0" (UID: "2691ac33-9f65-4cdf-9fc6-a537afffe6e0"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.702337 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-kube-api-access-r2kzf" (OuterVolumeSpecName: "kube-api-access-r2kzf") pod "2691ac33-9f65-4cdf-9fc6-a537afffe6e0" (UID: "2691ac33-9f65-4cdf-9fc6-a537afffe6e0"). InnerVolumeSpecName "kube-api-access-r2kzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.797963 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2kzf\" (UniqueName: \"kubernetes.io/projected/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-kube-api-access-r2kzf\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.797997 4755 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:10 crc kubenswrapper[4755]: I1006 08:38:10.798011 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2691ac33-9f65-4cdf-9fc6-a537afffe6e0-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.280706 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b4rd2-config-66pd2" event={"ID":"2691ac33-9f65-4cdf-9fc6-a537afffe6e0","Type":"ContainerDied","Data":"0adfb1112b272a80f61516bdcc1ce7a83e6f130bfddd2f0d1d950e24698f6eb3"} Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.280748 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0adfb1112b272a80f61516bdcc1ce7a83e6f130bfddd2f0d1d950e24698f6eb3" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.280744 4755 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/ovn-controller-b4rd2-config-66pd2" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.709376 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-b4rd2-config-66pd2"] Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.717095 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-b4rd2-config-66pd2"] Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.823262 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-b4rd2-config-xrlt9"] Oct 06 08:38:11 crc kubenswrapper[4755]: E1006 08:38:11.823579 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98b7aae3-9795-47d6-89db-2e4f91af9a0e" containerName="mariadb-account-create" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.823596 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="98b7aae3-9795-47d6-89db-2e4f91af9a0e" containerName="mariadb-account-create" Oct 06 08:38:11 crc kubenswrapper[4755]: E1006 08:38:11.823637 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2691ac33-9f65-4cdf-9fc6-a537afffe6e0" containerName="ovn-config" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.823644 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2691ac33-9f65-4cdf-9fc6-a537afffe6e0" containerName="ovn-config" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.823783 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="98b7aae3-9795-47d6-89db-2e4f91af9a0e" containerName="mariadb-account-create" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.823793 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2691ac33-9f65-4cdf-9fc6-a537afffe6e0" containerName="ovn-config" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.824300 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.827063 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.845665 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b4rd2-config-xrlt9"] Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.887961 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2691ac33-9f65-4cdf-9fc6-a537afffe6e0" path="/var/lib/kubelet/pods/2691ac33-9f65-4cdf-9fc6-a537afffe6e0/volumes" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.913933 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzssc\" (UniqueName: \"kubernetes.io/projected/87df5ed4-9997-42d8-a21c-a15247d3b3f2-kube-api-access-dzssc\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.913980 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/87df5ed4-9997-42d8-a21c-a15247d3b3f2-additional-scripts\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.914041 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-log-ovn\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 
08:38:11.914195 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-run-ovn\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.914242 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87df5ed4-9997-42d8-a21c-a15247d3b3f2-scripts\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:11 crc kubenswrapper[4755]: I1006 08:38:11.914283 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-run\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.015429 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzssc\" (UniqueName: \"kubernetes.io/projected/87df5ed4-9997-42d8-a21c-a15247d3b3f2-kube-api-access-dzssc\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.015478 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/87df5ed4-9997-42d8-a21c-a15247d3b3f2-additional-scripts\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 
08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.015531 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-log-ovn\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.015615 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-run-ovn\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.015665 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87df5ed4-9997-42d8-a21c-a15247d3b3f2-scripts\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.015715 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-run\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.016772 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-run\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.016853 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-run-ovn\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.016860 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-log-ovn\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.017228 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/87df5ed4-9997-42d8-a21c-a15247d3b3f2-additional-scripts\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.018869 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87df5ed4-9997-42d8-a21c-a15247d3b3f2-scripts\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.035105 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzssc\" (UniqueName: \"kubernetes.io/projected/87df5ed4-9997-42d8-a21c-a15247d3b3f2-kube-api-access-dzssc\") pod \"ovn-controller-b4rd2-config-xrlt9\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.140370 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.356019 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-b4rd2" Oct 06 08:38:12 crc kubenswrapper[4755]: I1006 08:38:12.644743 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b4rd2-config-xrlt9"] Oct 06 08:38:13 crc kubenswrapper[4755]: I1006 08:38:13.315925 4755 generic.go:334] "Generic (PLEG): container finished" podID="87df5ed4-9997-42d8-a21c-a15247d3b3f2" containerID="b31eaffd9f888a582bc731bff8351ea0ad7cd7b9a7e939e7eea6609331b77fba" exitCode=0 Oct 06 08:38:13 crc kubenswrapper[4755]: I1006 08:38:13.316024 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b4rd2-config-xrlt9" event={"ID":"87df5ed4-9997-42d8-a21c-a15247d3b3f2","Type":"ContainerDied","Data":"b31eaffd9f888a582bc731bff8351ea0ad7cd7b9a7e939e7eea6609331b77fba"} Oct 06 08:38:13 crc kubenswrapper[4755]: I1006 08:38:13.316206 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b4rd2-config-xrlt9" event={"ID":"87df5ed4-9997-42d8-a21c-a15247d3b3f2","Type":"ContainerStarted","Data":"ca11dd7f4db87ebbee40620e90beb11ca15af4dd57a1c64432908ca4e5133333"} Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.371236 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b4rd2-config-xrlt9" event={"ID":"87df5ed4-9997-42d8-a21c-a15247d3b3f2","Type":"ContainerDied","Data":"ca11dd7f4db87ebbee40620e90beb11ca15af4dd57a1c64432908ca4e5133333"} Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.372133 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca11dd7f4db87ebbee40620e90beb11ca15af4dd57a1c64432908ca4e5133333" Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.420841 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.556029 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/87df5ed4-9997-42d8-a21c-a15247d3b3f2-additional-scripts\") pod \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.556142 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-log-ovn\") pod \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.556224 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87df5ed4-9997-42d8-a21c-a15247d3b3f2-scripts\") pod \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.556300 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "87df5ed4-9997-42d8-a21c-a15247d3b3f2" (UID: "87df5ed4-9997-42d8-a21c-a15247d3b3f2"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.556805 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzssc\" (UniqueName: \"kubernetes.io/projected/87df5ed4-9997-42d8-a21c-a15247d3b3f2-kube-api-access-dzssc\") pod \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.556867 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-run\") pod \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.556894 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-run-ovn\") pod \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\" (UID: \"87df5ed4-9997-42d8-a21c-a15247d3b3f2\") " Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.556953 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "87df5ed4-9997-42d8-a21c-a15247d3b3f2" (UID: "87df5ed4-9997-42d8-a21c-a15247d3b3f2"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.557257 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87df5ed4-9997-42d8-a21c-a15247d3b3f2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "87df5ed4-9997-42d8-a21c-a15247d3b3f2" (UID: "87df5ed4-9997-42d8-a21c-a15247d3b3f2"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.557306 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-run" (OuterVolumeSpecName: "var-run") pod "87df5ed4-9997-42d8-a21c-a15247d3b3f2" (UID: "87df5ed4-9997-42d8-a21c-a15247d3b3f2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.557948 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87df5ed4-9997-42d8-a21c-a15247d3b3f2-scripts" (OuterVolumeSpecName: "scripts") pod "87df5ed4-9997-42d8-a21c-a15247d3b3f2" (UID: "87df5ed4-9997-42d8-a21c-a15247d3b3f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.558828 4755 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.558868 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87df5ed4-9997-42d8-a21c-a15247d3b3f2-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.558905 4755 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.558924 4755 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/87df5ed4-9997-42d8-a21c-a15247d3b3f2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.558943 4755 
reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/87df5ed4-9997-42d8-a21c-a15247d3b3f2-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.561711 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87df5ed4-9997-42d8-a21c-a15247d3b3f2-kube-api-access-dzssc" (OuterVolumeSpecName: "kube-api-access-dzssc") pod "87df5ed4-9997-42d8-a21c-a15247d3b3f2" (UID: "87df5ed4-9997-42d8-a21c-a15247d3b3f2"). InnerVolumeSpecName "kube-api-access-dzssc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:20 crc kubenswrapper[4755]: I1006 08:38:20.660676 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzssc\" (UniqueName: \"kubernetes.io/projected/87df5ed4-9997-42d8-a21c-a15247d3b3f2-kube-api-access-dzssc\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:21 crc kubenswrapper[4755]: I1006 08:38:21.379819 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b4rd2-config-xrlt9" Oct 06 08:38:21 crc kubenswrapper[4755]: I1006 08:38:21.379822 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fh9p5" event={"ID":"97f025db-474e-4629-96e7-2ebbd9413fc4","Type":"ContainerStarted","Data":"0bca671031cb203de217a5b6c7ecd4e21457f99bff27da7127a181d9a545765f"} Oct 06 08:38:21 crc kubenswrapper[4755]: I1006 08:38:21.401512 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-fh9p5" podStartSLOduration=2.410397295 podStartE2EDuration="14.401494301s" podCreationTimestamp="2025-10-06 08:38:07 +0000 UTC" firstStartedPulling="2025-10-06 08:38:08.452666628 +0000 UTC m=+945.281981842" lastFinishedPulling="2025-10-06 08:38:20.443763594 +0000 UTC m=+957.273078848" observedRunningTime="2025-10-06 08:38:21.394610714 +0000 UTC m=+958.223925948" watchObservedRunningTime="2025-10-06 08:38:21.401494301 +0000 UTC m=+958.230809525" Oct 06 08:38:21 crc kubenswrapper[4755]: I1006 08:38:21.503844 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-b4rd2-config-xrlt9"] Oct 06 08:38:21 crc kubenswrapper[4755]: I1006 08:38:21.510554 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-b4rd2-config-xrlt9"] Oct 06 08:38:21 crc kubenswrapper[4755]: I1006 08:38:21.890593 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87df5ed4-9997-42d8-a21c-a15247d3b3f2" path="/var/lib/kubelet/pods/87df5ed4-9997-42d8-a21c-a15247d3b3f2/volumes" Oct 06 08:38:23 crc kubenswrapper[4755]: I1006 08:38:23.940902 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.184811 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.297510 
4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-bqjxk"] Oct 06 08:38:24 crc kubenswrapper[4755]: E1006 08:38:24.299335 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87df5ed4-9997-42d8-a21c-a15247d3b3f2" containerName="ovn-config" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.299361 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="87df5ed4-9997-42d8-a21c-a15247d3b3f2" containerName="ovn-config" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.299579 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="87df5ed4-9997-42d8-a21c-a15247d3b3f2" containerName="ovn-config" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.300222 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bqjxk" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.305449 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bqjxk"] Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.381927 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-bxksl"] Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.387645 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bxksl" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.393098 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bxksl"] Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.454581 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwdv6\" (UniqueName: \"kubernetes.io/projected/1e07be36-828e-457f-aa2e-091536b43617-kube-api-access-cwdv6\") pod \"barbican-db-create-bxksl\" (UID: \"1e07be36-828e-457f-aa2e-091536b43617\") " pod="openstack/barbican-db-create-bxksl" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.454666 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjxj4\" (UniqueName: \"kubernetes.io/projected/3b72f6f8-209e-492d-87af-03810abce3bd-kube-api-access-pjxj4\") pod \"cinder-db-create-bqjxk\" (UID: \"3b72f6f8-209e-492d-87af-03810abce3bd\") " pod="openstack/cinder-db-create-bqjxk" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.527987 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-qwklr"] Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.529250 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qwklr" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.537057 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.537291 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.537415 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xlwvt" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.537626 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.542092 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qwklr"] Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.556180 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwdv6\" (UniqueName: \"kubernetes.io/projected/1e07be36-828e-457f-aa2e-091536b43617-kube-api-access-cwdv6\") pod \"barbican-db-create-bxksl\" (UID: \"1e07be36-828e-457f-aa2e-091536b43617\") " pod="openstack/barbican-db-create-bxksl" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.556274 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjxj4\" (UniqueName: \"kubernetes.io/projected/3b72f6f8-209e-492d-87af-03810abce3bd-kube-api-access-pjxj4\") pod \"cinder-db-create-bqjxk\" (UID: \"3b72f6f8-209e-492d-87af-03810abce3bd\") " pod="openstack/cinder-db-create-bqjxk" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.579807 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fr5h2"] Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.580123 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjxj4\" (UniqueName: 
\"kubernetes.io/projected/3b72f6f8-209e-492d-87af-03810abce3bd-kube-api-access-pjxj4\") pod \"cinder-db-create-bqjxk\" (UID: \"3b72f6f8-209e-492d-87af-03810abce3bd\") " pod="openstack/cinder-db-create-bqjxk" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.580941 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fr5h2" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.589054 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fr5h2"] Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.603704 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwdv6\" (UniqueName: \"kubernetes.io/projected/1e07be36-828e-457f-aa2e-091536b43617-kube-api-access-cwdv6\") pod \"barbican-db-create-bxksl\" (UID: \"1e07be36-828e-457f-aa2e-091536b43617\") " pod="openstack/barbican-db-create-bxksl" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.625951 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bqjxk" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.657916 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7stv\" (UniqueName: \"kubernetes.io/projected/8ceb537e-0d92-47ba-8cf4-470c3caa3765-kube-api-access-h7stv\") pod \"keystone-db-sync-qwklr\" (UID: \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\") " pod="openstack/keystone-db-sync-qwklr" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.658154 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xngcb\" (UniqueName: \"kubernetes.io/projected/5227182b-b51d-46ee-a837-44eb07a36637-kube-api-access-xngcb\") pod \"neutron-db-create-fr5h2\" (UID: \"5227182b-b51d-46ee-a837-44eb07a36637\") " pod="openstack/neutron-db-create-fr5h2" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.658176 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ceb537e-0d92-47ba-8cf4-470c3caa3765-combined-ca-bundle\") pod \"keystone-db-sync-qwklr\" (UID: \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\") " pod="openstack/keystone-db-sync-qwklr" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.658214 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ceb537e-0d92-47ba-8cf4-470c3caa3765-config-data\") pod \"keystone-db-sync-qwklr\" (UID: \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\") " pod="openstack/keystone-db-sync-qwklr" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.707511 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bxksl" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.759421 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7stv\" (UniqueName: \"kubernetes.io/projected/8ceb537e-0d92-47ba-8cf4-470c3caa3765-kube-api-access-h7stv\") pod \"keystone-db-sync-qwklr\" (UID: \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\") " pod="openstack/keystone-db-sync-qwklr" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.759460 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xngcb\" (UniqueName: \"kubernetes.io/projected/5227182b-b51d-46ee-a837-44eb07a36637-kube-api-access-xngcb\") pod \"neutron-db-create-fr5h2\" (UID: \"5227182b-b51d-46ee-a837-44eb07a36637\") " pod="openstack/neutron-db-create-fr5h2" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.759480 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ceb537e-0d92-47ba-8cf4-470c3caa3765-combined-ca-bundle\") pod \"keystone-db-sync-qwklr\" (UID: \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\") " pod="openstack/keystone-db-sync-qwklr" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.759514 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ceb537e-0d92-47ba-8cf4-470c3caa3765-config-data\") pod \"keystone-db-sync-qwklr\" (UID: \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\") " pod="openstack/keystone-db-sync-qwklr" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.763925 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ceb537e-0d92-47ba-8cf4-470c3caa3765-config-data\") pod \"keystone-db-sync-qwklr\" (UID: \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\") " pod="openstack/keystone-db-sync-qwklr" Oct 06 08:38:24 crc 
kubenswrapper[4755]: I1006 08:38:24.768517 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ceb537e-0d92-47ba-8cf4-470c3caa3765-combined-ca-bundle\") pod \"keystone-db-sync-qwklr\" (UID: \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\") " pod="openstack/keystone-db-sync-qwklr" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.776457 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7stv\" (UniqueName: \"kubernetes.io/projected/8ceb537e-0d92-47ba-8cf4-470c3caa3765-kube-api-access-h7stv\") pod \"keystone-db-sync-qwklr\" (UID: \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\") " pod="openstack/keystone-db-sync-qwklr" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.779123 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xngcb\" (UniqueName: \"kubernetes.io/projected/5227182b-b51d-46ee-a837-44eb07a36637-kube-api-access-xngcb\") pod \"neutron-db-create-fr5h2\" (UID: \"5227182b-b51d-46ee-a837-44eb07a36637\") " pod="openstack/neutron-db-create-fr5h2" Oct 06 08:38:24 crc kubenswrapper[4755]: I1006 08:38:24.986267 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qwklr" Oct 06 08:38:25 crc kubenswrapper[4755]: I1006 08:38:24.999304 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fr5h2" Oct 06 08:38:25 crc kubenswrapper[4755]: I1006 08:38:25.059227 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bqjxk"] Oct 06 08:38:25 crc kubenswrapper[4755]: W1006 08:38:25.061446 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b72f6f8_209e_492d_87af_03810abce3bd.slice/crio-c7de8908b5c71ad3da623046c666fdb103359a1b14fcd735ea747d79697fe95b WatchSource:0}: Error finding container c7de8908b5c71ad3da623046c666fdb103359a1b14fcd735ea747d79697fe95b: Status 404 returned error can't find the container with id c7de8908b5c71ad3da623046c666fdb103359a1b14fcd735ea747d79697fe95b Oct 06 08:38:25 crc kubenswrapper[4755]: I1006 08:38:25.190215 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bxksl"] Oct 06 08:38:25 crc kubenswrapper[4755]: W1006 08:38:25.196862 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e07be36_828e_457f_aa2e_091536b43617.slice/crio-db7e7710b87a6a5e8d611f964950598222e7a70ab3e00a9c649a816b79017f93 WatchSource:0}: Error finding container db7e7710b87a6a5e8d611f964950598222e7a70ab3e00a9c649a816b79017f93: Status 404 returned error can't find the container with id db7e7710b87a6a5e8d611f964950598222e7a70ab3e00a9c649a816b79017f93 Oct 06 08:38:25 crc kubenswrapper[4755]: I1006 08:38:25.413215 4755 generic.go:334] "Generic (PLEG): container finished" podID="1e07be36-828e-457f-aa2e-091536b43617" containerID="41b32dd76f015a9bca1dc4959b99265335c32fe13cccba0971249a9ebae11103" exitCode=0 Oct 06 08:38:25 crc kubenswrapper[4755]: I1006 08:38:25.413333 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bxksl" 
event={"ID":"1e07be36-828e-457f-aa2e-091536b43617","Type":"ContainerDied","Data":"41b32dd76f015a9bca1dc4959b99265335c32fe13cccba0971249a9ebae11103"} Oct 06 08:38:25 crc kubenswrapper[4755]: I1006 08:38:25.413637 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bxksl" event={"ID":"1e07be36-828e-457f-aa2e-091536b43617","Type":"ContainerStarted","Data":"db7e7710b87a6a5e8d611f964950598222e7a70ab3e00a9c649a816b79017f93"} Oct 06 08:38:25 crc kubenswrapper[4755]: I1006 08:38:25.414793 4755 generic.go:334] "Generic (PLEG): container finished" podID="3b72f6f8-209e-492d-87af-03810abce3bd" containerID="8080ac391ff55c171c520aef4b278f7c42fae02f302c38f582405d245bb858e1" exitCode=0 Oct 06 08:38:25 crc kubenswrapper[4755]: I1006 08:38:25.414830 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bqjxk" event={"ID":"3b72f6f8-209e-492d-87af-03810abce3bd","Type":"ContainerDied","Data":"8080ac391ff55c171c520aef4b278f7c42fae02f302c38f582405d245bb858e1"} Oct 06 08:38:25 crc kubenswrapper[4755]: I1006 08:38:25.414858 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bqjxk" event={"ID":"3b72f6f8-209e-492d-87af-03810abce3bd","Type":"ContainerStarted","Data":"c7de8908b5c71ad3da623046c666fdb103359a1b14fcd735ea747d79697fe95b"} Oct 06 08:38:25 crc kubenswrapper[4755]: I1006 08:38:25.446169 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fr5h2"] Oct 06 08:38:25 crc kubenswrapper[4755]: W1006 08:38:25.461337 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5227182b_b51d_46ee_a837_44eb07a36637.slice/crio-48c56f1d6804de13e75cca86db3d08235ff7267cb74a02fdbc016fdae794441d WatchSource:0}: Error finding container 48c56f1d6804de13e75cca86db3d08235ff7267cb74a02fdbc016fdae794441d: Status 404 returned error can't find the container with id 
48c56f1d6804de13e75cca86db3d08235ff7267cb74a02fdbc016fdae794441d Oct 06 08:38:25 crc kubenswrapper[4755]: I1006 08:38:25.507479 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qwklr"] Oct 06 08:38:25 crc kubenswrapper[4755]: W1006 08:38:25.529403 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ceb537e_0d92_47ba_8cf4_470c3caa3765.slice/crio-884420ad00664988bf1b43088bc9d6faff47398b2a7c5ca95e01add430c5de2d WatchSource:0}: Error finding container 884420ad00664988bf1b43088bc9d6faff47398b2a7c5ca95e01add430c5de2d: Status 404 returned error can't find the container with id 884420ad00664988bf1b43088bc9d6faff47398b2a7c5ca95e01add430c5de2d Oct 06 08:38:26 crc kubenswrapper[4755]: I1006 08:38:26.437067 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qwklr" event={"ID":"8ceb537e-0d92-47ba-8cf4-470c3caa3765","Type":"ContainerStarted","Data":"884420ad00664988bf1b43088bc9d6faff47398b2a7c5ca95e01add430c5de2d"} Oct 06 08:38:26 crc kubenswrapper[4755]: I1006 08:38:26.444994 4755 generic.go:334] "Generic (PLEG): container finished" podID="5227182b-b51d-46ee-a837-44eb07a36637" containerID="5ef92c5f3594f985c476edb365b34fba9702a3e539294e14d9e2ea700a89c348" exitCode=0 Oct 06 08:38:26 crc kubenswrapper[4755]: I1006 08:38:26.445195 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fr5h2" event={"ID":"5227182b-b51d-46ee-a837-44eb07a36637","Type":"ContainerDied","Data":"5ef92c5f3594f985c476edb365b34fba9702a3e539294e14d9e2ea700a89c348"} Oct 06 08:38:26 crc kubenswrapper[4755]: I1006 08:38:26.445225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fr5h2" event={"ID":"5227182b-b51d-46ee-a837-44eb07a36637","Type":"ContainerStarted","Data":"48c56f1d6804de13e75cca86db3d08235ff7267cb74a02fdbc016fdae794441d"} Oct 06 08:38:26 crc kubenswrapper[4755]: I1006 
08:38:26.841862 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bqjxk" Oct 06 08:38:26 crc kubenswrapper[4755]: I1006 08:38:26.850172 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bxksl" Oct 06 08:38:26 crc kubenswrapper[4755]: I1006 08:38:26.891685 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwdv6\" (UniqueName: \"kubernetes.io/projected/1e07be36-828e-457f-aa2e-091536b43617-kube-api-access-cwdv6\") pod \"1e07be36-828e-457f-aa2e-091536b43617\" (UID: \"1e07be36-828e-457f-aa2e-091536b43617\") " Oct 06 08:38:26 crc kubenswrapper[4755]: I1006 08:38:26.892021 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjxj4\" (UniqueName: \"kubernetes.io/projected/3b72f6f8-209e-492d-87af-03810abce3bd-kube-api-access-pjxj4\") pod \"3b72f6f8-209e-492d-87af-03810abce3bd\" (UID: \"3b72f6f8-209e-492d-87af-03810abce3bd\") " Oct 06 08:38:26 crc kubenswrapper[4755]: I1006 08:38:26.897766 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e07be36-828e-457f-aa2e-091536b43617-kube-api-access-cwdv6" (OuterVolumeSpecName: "kube-api-access-cwdv6") pod "1e07be36-828e-457f-aa2e-091536b43617" (UID: "1e07be36-828e-457f-aa2e-091536b43617"). InnerVolumeSpecName "kube-api-access-cwdv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:26 crc kubenswrapper[4755]: I1006 08:38:26.923197 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b72f6f8-209e-492d-87af-03810abce3bd-kube-api-access-pjxj4" (OuterVolumeSpecName: "kube-api-access-pjxj4") pod "3b72f6f8-209e-492d-87af-03810abce3bd" (UID: "3b72f6f8-209e-492d-87af-03810abce3bd"). InnerVolumeSpecName "kube-api-access-pjxj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:26 crc kubenswrapper[4755]: I1006 08:38:26.994285 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwdv6\" (UniqueName: \"kubernetes.io/projected/1e07be36-828e-457f-aa2e-091536b43617-kube-api-access-cwdv6\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:26 crc kubenswrapper[4755]: I1006 08:38:26.994321 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjxj4\" (UniqueName: \"kubernetes.io/projected/3b72f6f8-209e-492d-87af-03810abce3bd-kube-api-access-pjxj4\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:27 crc kubenswrapper[4755]: I1006 08:38:27.455242 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bxksl" event={"ID":"1e07be36-828e-457f-aa2e-091536b43617","Type":"ContainerDied","Data":"db7e7710b87a6a5e8d611f964950598222e7a70ab3e00a9c649a816b79017f93"} Oct 06 08:38:27 crc kubenswrapper[4755]: I1006 08:38:27.455484 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db7e7710b87a6a5e8d611f964950598222e7a70ab3e00a9c649a816b79017f93" Oct 06 08:38:27 crc kubenswrapper[4755]: I1006 08:38:27.455554 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bxksl" Oct 06 08:38:27 crc kubenswrapper[4755]: I1006 08:38:27.458721 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bqjxk" event={"ID":"3b72f6f8-209e-492d-87af-03810abce3bd","Type":"ContainerDied","Data":"c7de8908b5c71ad3da623046c666fdb103359a1b14fcd735ea747d79697fe95b"} Oct 06 08:38:27 crc kubenswrapper[4755]: I1006 08:38:27.458771 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7de8908b5c71ad3da623046c666fdb103359a1b14fcd735ea747d79697fe95b" Oct 06 08:38:27 crc kubenswrapper[4755]: I1006 08:38:27.458881 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bqjxk" Oct 06 08:38:29 crc kubenswrapper[4755]: I1006 08:38:29.480531 4755 generic.go:334] "Generic (PLEG): container finished" podID="97f025db-474e-4629-96e7-2ebbd9413fc4" containerID="0bca671031cb203de217a5b6c7ecd4e21457f99bff27da7127a181d9a545765f" exitCode=0 Oct 06 08:38:29 crc kubenswrapper[4755]: I1006 08:38:29.480636 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fh9p5" event={"ID":"97f025db-474e-4629-96e7-2ebbd9413fc4","Type":"ContainerDied","Data":"0bca671031cb203de217a5b6c7ecd4e21457f99bff27da7127a181d9a545765f"} Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.321688 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fr5h2" Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.459191 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xngcb\" (UniqueName: \"kubernetes.io/projected/5227182b-b51d-46ee-a837-44eb07a36637-kube-api-access-xngcb\") pod \"5227182b-b51d-46ee-a837-44eb07a36637\" (UID: \"5227182b-b51d-46ee-a837-44eb07a36637\") " Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.463487 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5227182b-b51d-46ee-a837-44eb07a36637-kube-api-access-xngcb" (OuterVolumeSpecName: "kube-api-access-xngcb") pod "5227182b-b51d-46ee-a837-44eb07a36637" (UID: "5227182b-b51d-46ee-a837-44eb07a36637"). InnerVolumeSpecName "kube-api-access-xngcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.491329 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qwklr" event={"ID":"8ceb537e-0d92-47ba-8cf4-470c3caa3765","Type":"ContainerStarted","Data":"07b93c917a8e8693bdbd74d3bcc9162c39421cdc741329e94d73d2d049f5c90f"} Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.494007 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fr5h2" event={"ID":"5227182b-b51d-46ee-a837-44eb07a36637","Type":"ContainerDied","Data":"48c56f1d6804de13e75cca86db3d08235ff7267cb74a02fdbc016fdae794441d"} Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.494063 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fr5h2" Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.494069 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48c56f1d6804de13e75cca86db3d08235ff7267cb74a02fdbc016fdae794441d" Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.521928 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-qwklr" podStartSLOduration=1.8577956119999999 podStartE2EDuration="6.521911026s" podCreationTimestamp="2025-10-06 08:38:24 +0000 UTC" firstStartedPulling="2025-10-06 08:38:25.532221303 +0000 UTC m=+962.361536507" lastFinishedPulling="2025-10-06 08:38:30.196336697 +0000 UTC m=+967.025651921" observedRunningTime="2025-10-06 08:38:30.518054962 +0000 UTC m=+967.347370206" watchObservedRunningTime="2025-10-06 08:38:30.521911026 +0000 UTC m=+967.351226240" Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.562231 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xngcb\" (UniqueName: \"kubernetes.io/projected/5227182b-b51d-46ee-a837-44eb07a36637-kube-api-access-xngcb\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:30 crc 
kubenswrapper[4755]: I1006 08:38:30.807416 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.866075 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmqhg\" (UniqueName: \"kubernetes.io/projected/97f025db-474e-4629-96e7-2ebbd9413fc4-kube-api-access-lmqhg\") pod \"97f025db-474e-4629-96e7-2ebbd9413fc4\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.866140 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-config-data\") pod \"97f025db-474e-4629-96e7-2ebbd9413fc4\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.866181 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-combined-ca-bundle\") pod \"97f025db-474e-4629-96e7-2ebbd9413fc4\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.866368 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-db-sync-config-data\") pod \"97f025db-474e-4629-96e7-2ebbd9413fc4\" (UID: \"97f025db-474e-4629-96e7-2ebbd9413fc4\") " Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.870694 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "97f025db-474e-4629-96e7-2ebbd9413fc4" (UID: "97f025db-474e-4629-96e7-2ebbd9413fc4"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.871096 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f025db-474e-4629-96e7-2ebbd9413fc4-kube-api-access-lmqhg" (OuterVolumeSpecName: "kube-api-access-lmqhg") pod "97f025db-474e-4629-96e7-2ebbd9413fc4" (UID: "97f025db-474e-4629-96e7-2ebbd9413fc4"). InnerVolumeSpecName "kube-api-access-lmqhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.895966 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97f025db-474e-4629-96e7-2ebbd9413fc4" (UID: "97f025db-474e-4629-96e7-2ebbd9413fc4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.907184 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-config-data" (OuterVolumeSpecName: "config-data") pod "97f025db-474e-4629-96e7-2ebbd9413fc4" (UID: "97f025db-474e-4629-96e7-2ebbd9413fc4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.968021 4755 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.968052 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmqhg\" (UniqueName: \"kubernetes.io/projected/97f025db-474e-4629-96e7-2ebbd9413fc4-kube-api-access-lmqhg\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.968066 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:30 crc kubenswrapper[4755]: I1006 08:38:30.968075 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f025db-474e-4629-96e7-2ebbd9413fc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.502381 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fh9p5" event={"ID":"97f025db-474e-4629-96e7-2ebbd9413fc4","Type":"ContainerDied","Data":"ff8456bf5214bc06a79426935a7eab535145560df8c07084a74438b28dcd6059"} Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.504108 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff8456bf5214bc06a79426935a7eab535145560df8c07084a74438b28dcd6059" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.502411 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fh9p5" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.869183 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-plzpq"] Oct 06 08:38:31 crc kubenswrapper[4755]: E1006 08:38:31.869784 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f025db-474e-4629-96e7-2ebbd9413fc4" containerName="glance-db-sync" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.869796 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f025db-474e-4629-96e7-2ebbd9413fc4" containerName="glance-db-sync" Oct 06 08:38:31 crc kubenswrapper[4755]: E1006 08:38:31.869810 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e07be36-828e-457f-aa2e-091536b43617" containerName="mariadb-database-create" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.869815 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e07be36-828e-457f-aa2e-091536b43617" containerName="mariadb-database-create" Oct 06 08:38:31 crc kubenswrapper[4755]: E1006 08:38:31.869837 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b72f6f8-209e-492d-87af-03810abce3bd" containerName="mariadb-database-create" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.869846 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b72f6f8-209e-492d-87af-03810abce3bd" containerName="mariadb-database-create" Oct 06 08:38:31 crc kubenswrapper[4755]: E1006 08:38:31.869864 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5227182b-b51d-46ee-a837-44eb07a36637" containerName="mariadb-database-create" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.869870 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5227182b-b51d-46ee-a837-44eb07a36637" containerName="mariadb-database-create" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.870032 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3b72f6f8-209e-492d-87af-03810abce3bd" containerName="mariadb-database-create" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.870043 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5227182b-b51d-46ee-a837-44eb07a36637" containerName="mariadb-database-create" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.870055 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e07be36-828e-457f-aa2e-091536b43617" containerName="mariadb-database-create" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.870067 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="97f025db-474e-4629-96e7-2ebbd9413fc4" containerName="glance-db-sync" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.871259 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.902996 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-plzpq"] Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.987640 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.988102 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-config\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.988226 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.988251 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:31 crc kubenswrapper[4755]: I1006 08:38:31.988284 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvfsw\" (UniqueName: \"kubernetes.io/projected/7ca46c65-cbb8-496b-9639-05952a779e26-kube-api-access-kvfsw\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:32 crc kubenswrapper[4755]: I1006 08:38:32.090187 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:32 crc kubenswrapper[4755]: I1006 08:38:32.090241 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:32 crc kubenswrapper[4755]: I1006 08:38:32.090279 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kvfsw\" (UniqueName: \"kubernetes.io/projected/7ca46c65-cbb8-496b-9639-05952a779e26-kube-api-access-kvfsw\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:32 crc kubenswrapper[4755]: I1006 08:38:32.090332 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:32 crc kubenswrapper[4755]: I1006 08:38:32.090365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-config\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:32 crc kubenswrapper[4755]: I1006 08:38:32.091294 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-config\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:32 crc kubenswrapper[4755]: I1006 08:38:32.091378 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:32 crc kubenswrapper[4755]: I1006 08:38:32.091472 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:32 crc kubenswrapper[4755]: I1006 08:38:32.091743 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:32 crc kubenswrapper[4755]: I1006 08:38:32.110751 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvfsw\" (UniqueName: \"kubernetes.io/projected/7ca46c65-cbb8-496b-9639-05952a779e26-kube-api-access-kvfsw\") pod \"dnsmasq-dns-54f9b7b8d9-plzpq\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:32 crc kubenswrapper[4755]: I1006 08:38:32.194783 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:32 crc kubenswrapper[4755]: I1006 08:38:32.631010 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-plzpq"] Oct 06 08:38:32 crc kubenswrapper[4755]: W1006 08:38:32.652523 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ca46c65_cbb8_496b_9639_05952a779e26.slice/crio-44e4d5e1b4ef87df54a5fcb37e3ac6fe9228d059d77ec0b5e81199c39031af5d WatchSource:0}: Error finding container 44e4d5e1b4ef87df54a5fcb37e3ac6fe9228d059d77ec0b5e81199c39031af5d: Status 404 returned error can't find the container with id 44e4d5e1b4ef87df54a5fcb37e3ac6fe9228d059d77ec0b5e81199c39031af5d Oct 06 08:38:33 crc kubenswrapper[4755]: I1006 08:38:33.521767 4755 generic.go:334] "Generic (PLEG): container finished" podID="7ca46c65-cbb8-496b-9639-05952a779e26" containerID="293a2615ad64f6ae0725ea7f9804d735c854c1e061d6e7fb28ab1999a68f6620" exitCode=0 Oct 06 08:38:33 crc kubenswrapper[4755]: I1006 08:38:33.521812 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" event={"ID":"7ca46c65-cbb8-496b-9639-05952a779e26","Type":"ContainerDied","Data":"293a2615ad64f6ae0725ea7f9804d735c854c1e061d6e7fb28ab1999a68f6620"} Oct 06 08:38:33 crc kubenswrapper[4755]: I1006 08:38:33.522152 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" event={"ID":"7ca46c65-cbb8-496b-9639-05952a779e26","Type":"ContainerStarted","Data":"44e4d5e1b4ef87df54a5fcb37e3ac6fe9228d059d77ec0b5e81199c39031af5d"} Oct 06 08:38:33 crc kubenswrapper[4755]: I1006 08:38:33.528290 4755 generic.go:334] "Generic (PLEG): container finished" podID="8ceb537e-0d92-47ba-8cf4-470c3caa3765" containerID="07b93c917a8e8693bdbd74d3bcc9162c39421cdc741329e94d73d2d049f5c90f" exitCode=0 Oct 06 08:38:33 crc kubenswrapper[4755]: I1006 08:38:33.528887 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qwklr" event={"ID":"8ceb537e-0d92-47ba-8cf4-470c3caa3765","Type":"ContainerDied","Data":"07b93c917a8e8693bdbd74d3bcc9162c39421cdc741329e94d73d2d049f5c90f"} Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.333675 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8bef-account-create-2qpz4"] Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.335010 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8bef-account-create-2qpz4" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.337481 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.343791 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8bef-account-create-2qpz4"] Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.429847 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9kgq\" (UniqueName: \"kubernetes.io/projected/9265504b-7527-495e-86bb-6042cc6ddec7-kube-api-access-g9kgq\") pod \"cinder-8bef-account-create-2qpz4\" (UID: \"9265504b-7527-495e-86bb-6042cc6ddec7\") " pod="openstack/cinder-8bef-account-create-2qpz4" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.442520 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ef82-account-create-4v5fj"] Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.443768 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ef82-account-create-4v5fj" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.447313 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.456435 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ef82-account-create-4v5fj"] Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.531312 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9kgq\" (UniqueName: \"kubernetes.io/projected/9265504b-7527-495e-86bb-6042cc6ddec7-kube-api-access-g9kgq\") pod \"cinder-8bef-account-create-2qpz4\" (UID: \"9265504b-7527-495e-86bb-6042cc6ddec7\") " pod="openstack/cinder-8bef-account-create-2qpz4" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.531438 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbsj2\" (UniqueName: \"kubernetes.io/projected/1c44bae0-2561-4bfd-9a0d-0f5130838f9c-kube-api-access-lbsj2\") pod \"barbican-ef82-account-create-4v5fj\" (UID: \"1c44bae0-2561-4bfd-9a0d-0f5130838f9c\") " pod="openstack/barbican-ef82-account-create-4v5fj" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.540822 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" event={"ID":"7ca46c65-cbb8-496b-9639-05952a779e26","Type":"ContainerStarted","Data":"9bd96c3e41ffea82bdee85f2297492f4ee400ed814f5cc0fbfd8a253cd98462a"} Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.540912 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.553083 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9kgq\" (UniqueName: 
\"kubernetes.io/projected/9265504b-7527-495e-86bb-6042cc6ddec7-kube-api-access-g9kgq\") pod \"cinder-8bef-account-create-2qpz4\" (UID: \"9265504b-7527-495e-86bb-6042cc6ddec7\") " pod="openstack/cinder-8bef-account-create-2qpz4" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.563727 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" podStartSLOduration=3.563477337 podStartE2EDuration="3.563477337s" podCreationTimestamp="2025-10-06 08:38:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:34.557300776 +0000 UTC m=+971.386616010" watchObservedRunningTime="2025-10-06 08:38:34.563477337 +0000 UTC m=+971.392792551" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.632904 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbsj2\" (UniqueName: \"kubernetes.io/projected/1c44bae0-2561-4bfd-9a0d-0f5130838f9c-kube-api-access-lbsj2\") pod \"barbican-ef82-account-create-4v5fj\" (UID: \"1c44bae0-2561-4bfd-9a0d-0f5130838f9c\") " pod="openstack/barbican-ef82-account-create-4v5fj" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.654483 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8bef-account-create-2qpz4" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.670995 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbsj2\" (UniqueName: \"kubernetes.io/projected/1c44bae0-2561-4bfd-9a0d-0f5130838f9c-kube-api-access-lbsj2\") pod \"barbican-ef82-account-create-4v5fj\" (UID: \"1c44bae0-2561-4bfd-9a0d-0f5130838f9c\") " pod="openstack/barbican-ef82-account-create-4v5fj" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.766037 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ef82-account-create-4v5fj" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.842944 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qwklr" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.936915 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7stv\" (UniqueName: \"kubernetes.io/projected/8ceb537e-0d92-47ba-8cf4-470c3caa3765-kube-api-access-h7stv\") pod \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\" (UID: \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\") " Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.937455 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ceb537e-0d92-47ba-8cf4-470c3caa3765-config-data\") pod \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\" (UID: \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\") " Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.937649 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ceb537e-0d92-47ba-8cf4-470c3caa3765-combined-ca-bundle\") pod \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\" (UID: \"8ceb537e-0d92-47ba-8cf4-470c3caa3765\") " Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.941482 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ceb537e-0d92-47ba-8cf4-470c3caa3765-kube-api-access-h7stv" (OuterVolumeSpecName: "kube-api-access-h7stv") pod "8ceb537e-0d92-47ba-8cf4-470c3caa3765" (UID: "8ceb537e-0d92-47ba-8cf4-470c3caa3765"). InnerVolumeSpecName "kube-api-access-h7stv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.968000 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ceb537e-0d92-47ba-8cf4-470c3caa3765-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ceb537e-0d92-47ba-8cf4-470c3caa3765" (UID: "8ceb537e-0d92-47ba-8cf4-470c3caa3765"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:34 crc kubenswrapper[4755]: I1006 08:38:34.983934 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ceb537e-0d92-47ba-8cf4-470c3caa3765-config-data" (OuterVolumeSpecName: "config-data") pod "8ceb537e-0d92-47ba-8cf4-470c3caa3765" (UID: "8ceb537e-0d92-47ba-8cf4-470c3caa3765"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.040159 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ceb537e-0d92-47ba-8cf4-470c3caa3765-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.040194 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ceb537e-0d92-47ba-8cf4-470c3caa3765-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.040232 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7stv\" (UniqueName: \"kubernetes.io/projected/8ceb537e-0d92-47ba-8cf4-470c3caa3765-kube-api-access-h7stv\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.183124 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8bef-account-create-2qpz4"] Oct 06 08:38:35 crc kubenswrapper[4755]: W1006 08:38:35.183907 4755 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9265504b_7527_495e_86bb_6042cc6ddec7.slice/crio-32d735436aead6362b2b98b8bec64d1d112a14100e47f4175fab5852e3ee02a4 WatchSource:0}: Error finding container 32d735436aead6362b2b98b8bec64d1d112a14100e47f4175fab5852e3ee02a4: Status 404 returned error can't find the container with id 32d735436aead6362b2b98b8bec64d1d112a14100e47f4175fab5852e3ee02a4 Oct 06 08:38:35 crc kubenswrapper[4755]: W1006 08:38:35.258076 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c44bae0_2561_4bfd_9a0d_0f5130838f9c.slice/crio-acd8d79300bfe04ba7c34ebf2f55a34c58d4224b90be0f0f75aa5b9acb7e2375 WatchSource:0}: Error finding container acd8d79300bfe04ba7c34ebf2f55a34c58d4224b90be0f0f75aa5b9acb7e2375: Status 404 returned error can't find the container with id acd8d79300bfe04ba7c34ebf2f55a34c58d4224b90be0f0f75aa5b9acb7e2375 Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.259677 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ef82-account-create-4v5fj"] Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.548660 4755 generic.go:334] "Generic (PLEG): container finished" podID="1c44bae0-2561-4bfd-9a0d-0f5130838f9c" containerID="1fbc26523512b85dd55613bbac44bccfff876e7d095b758502899c48aed5d694" exitCode=0 Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.548770 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ef82-account-create-4v5fj" event={"ID":"1c44bae0-2561-4bfd-9a0d-0f5130838f9c","Type":"ContainerDied","Data":"1fbc26523512b85dd55613bbac44bccfff876e7d095b758502899c48aed5d694"} Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.550270 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ef82-account-create-4v5fj" 
event={"ID":"1c44bae0-2561-4bfd-9a0d-0f5130838f9c","Type":"ContainerStarted","Data":"acd8d79300bfe04ba7c34ebf2f55a34c58d4224b90be0f0f75aa5b9acb7e2375"} Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.562915 4755 generic.go:334] "Generic (PLEG): container finished" podID="9265504b-7527-495e-86bb-6042cc6ddec7" containerID="ef3094cd8d71e856b999d3329d9b79ae34f3cc0b0ce5d51134fb7b1e3e422508" exitCode=0 Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.563001 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8bef-account-create-2qpz4" event={"ID":"9265504b-7527-495e-86bb-6042cc6ddec7","Type":"ContainerDied","Data":"ef3094cd8d71e856b999d3329d9b79ae34f3cc0b0ce5d51134fb7b1e3e422508"} Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.563029 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8bef-account-create-2qpz4" event={"ID":"9265504b-7527-495e-86bb-6042cc6ddec7","Type":"ContainerStarted","Data":"32d735436aead6362b2b98b8bec64d1d112a14100e47f4175fab5852e3ee02a4"} Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.565050 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qwklr" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.565705 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qwklr" event={"ID":"8ceb537e-0d92-47ba-8cf4-470c3caa3765","Type":"ContainerDied","Data":"884420ad00664988bf1b43088bc9d6faff47398b2a7c5ca95e01add430c5de2d"} Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.565746 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="884420ad00664988bf1b43088bc9d6faff47398b2a7c5ca95e01add430c5de2d" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.822829 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bxscf"] Oct 06 08:38:35 crc kubenswrapper[4755]: E1006 08:38:35.823269 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ceb537e-0d92-47ba-8cf4-470c3caa3765" containerName="keystone-db-sync" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.823293 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ceb537e-0d92-47ba-8cf4-470c3caa3765" containerName="keystone-db-sync" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.823519 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ceb537e-0d92-47ba-8cf4-470c3caa3765" containerName="keystone-db-sync" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.824294 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.829271 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xlwvt" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.829466 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.829624 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bxscf"] Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.836997 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-plzpq"] Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.838371 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.838710 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.901644 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-8jxkp"] Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.903461 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.905963 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-8jxkp"] Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.955413 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-config-data\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.955467 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45cvq\" (UniqueName: \"kubernetes.io/projected/31400554-3153-4f60-aa57-7dcc462d4018-kube-api-access-45cvq\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.955517 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-fernet-keys\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.955537 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-combined-ca-bundle\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.955691 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-scripts\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.955730 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-credential-keys\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:35 crc kubenswrapper[4755]: I1006 08:38:35.988118 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.005591 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.031103 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.035186 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.053701 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061162 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061217 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-config\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061262 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061320 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-scripts\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-credential-keys\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061387 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9597f\" (UniqueName: \"kubernetes.io/projected/c15438db-5fb5-4c27-88f8-74bcf28d9283-kube-api-access-9597f\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061416 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061434 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1b65447-5db8-480f-a0d5-17a674f2c401-log-httpd\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061462 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-config-data\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061487 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45cvq\" (UniqueName: \"kubernetes.io/projected/31400554-3153-4f60-aa57-7dcc462d4018-kube-api-access-45cvq\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061507 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-scripts\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-fernet-keys\") pod \"keystone-bootstrap-bxscf\" (UID: 
\"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061600 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-combined-ca-bundle\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061617 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2rg9\" (UniqueName: \"kubernetes.io/projected/b1b65447-5db8-480f-a0d5-17a674f2c401-kube-api-access-v2rg9\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061640 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-config-data\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061664 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1b65447-5db8-480f-a0d5-17a674f2c401-run-httpd\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061686 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " 
pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.061719 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-dns-svc\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.071537 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-scripts\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.077351 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-config-data\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.078143 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-credential-keys\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.087199 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-combined-ca-bundle\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.087414 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-fernet-keys\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.132476 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45cvq\" (UniqueName: \"kubernetes.io/projected/31400554-3153-4f60-aa57-7dcc462d4018-kube-api-access-45cvq\") pod \"keystone-bootstrap-bxscf\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.143682 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.164476 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2rg9\" (UniqueName: \"kubernetes.io/projected/b1b65447-5db8-480f-a0d5-17a674f2c401-kube-api-access-v2rg9\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.164551 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-config-data\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.164604 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1b65447-5db8-480f-a0d5-17a674f2c401-run-httpd\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.164638 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.164678 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-dns-svc\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.164702 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.164729 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-config\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.164770 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.164827 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9597f\" 
(UniqueName: \"kubernetes.io/projected/c15438db-5fb5-4c27-88f8-74bcf28d9283-kube-api-access-9597f\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.164862 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.164891 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1b65447-5db8-480f-a0d5-17a674f2c401-log-httpd\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.164932 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-scripts\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.166581 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.171099 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-config\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " 
pod="openstack/dnsmasq-dns-6546db6db7-8jxkp"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.172175 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " pod="openstack/dnsmasq-dns-6546db6db7-8jxkp"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.172487 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1b65447-5db8-480f-a0d5-17a674f2c401-run-httpd\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.173112 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-dns-svc\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " pod="openstack/dnsmasq-dns-6546db6db7-8jxkp"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.173313 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1b65447-5db8-480f-a0d5-17a674f2c401-log-httpd\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.182605 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-scripts\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.194300 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-config-data\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.197221 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.215816 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9597f\" (UniqueName: \"kubernetes.io/projected/c15438db-5fb5-4c27-88f8-74bcf28d9283-kube-api-access-9597f\") pod \"dnsmasq-dns-6546db6db7-8jxkp\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") " pod="openstack/dnsmasq-dns-6546db6db7-8jxkp"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.216570 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.223740 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-8jxkp"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.233595 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2rg9\" (UniqueName: \"kubernetes.io/projected/b1b65447-5db8-480f-a0d5-17a674f2c401-kube-api-access-v2rg9\") pod \"ceilometer-0\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " pod="openstack/ceilometer-0"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.261627 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fmkjk"]
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.273246 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.280934 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.281116 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mpb9p"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.281659 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.293645 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-8jxkp"]
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.328612 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fmkjk"]
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.344707 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-lxx5f"]
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.347181 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.355276 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.378338 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-combined-ca-bundle\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.378395 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce5daa56-27db-42d7-9f80-cb230c855299-logs\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.378423 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-config-data\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.378736 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wlc7\" (UniqueName: \"kubernetes.io/projected/ce5daa56-27db-42d7-9f80-cb230c855299-kube-api-access-2wlc7\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.378835 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-scripts\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.388832 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-lxx5f"]
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.482511 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-combined-ca-bundle\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.482746 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-config\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.482835 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce5daa56-27db-42d7-9f80-cb230c855299-logs\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.482893 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-config-data\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.482922 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.482961 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wlc7\" (UniqueName: \"kubernetes.io/projected/ce5daa56-27db-42d7-9f80-cb230c855299-kube-api-access-2wlc7\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.482996 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.483046 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppfr4\" (UniqueName: \"kubernetes.io/projected/ffad361d-03f7-4ed8-938c-013349c3eab0-kube-api-access-ppfr4\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.483134 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-scripts\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.483173 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.484839 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce5daa56-27db-42d7-9f80-cb230c855299-logs\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.494124 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-scripts\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.498429 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-config-data\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.501012 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-combined-ca-bundle\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.509164 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wlc7\" (UniqueName: \"kubernetes.io/projected/ce5daa56-27db-42d7-9f80-cb230c855299-kube-api-access-2wlc7\") pod \"placement-db-sync-fmkjk\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.572158 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" podUID="7ca46c65-cbb8-496b-9639-05952a779e26" containerName="dnsmasq-dns" containerID="cri-o://9bd96c3e41ffea82bdee85f2297492f4ee400ed814f5cc0fbfd8a253cd98462a" gracePeriod=10
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.584753 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.584813 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.584841 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfr4\" (UniqueName: \"kubernetes.io/projected/ffad361d-03f7-4ed8-938c-013349c3eab0-kube-api-access-ppfr4\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.584908 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.584946 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-config\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.585715 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-config\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.587070 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.587823 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.587996 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.607543 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppfr4\" (UniqueName: \"kubernetes.io/projected/ffad361d-03f7-4ed8-938c-013349c3eab0-kube-api-access-ppfr4\") pod \"dnsmasq-dns-7987f74bbc-lxx5f\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.670002 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fmkjk"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.718053 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f"
Oct 06 08:38:36 crc kubenswrapper[4755]: I1006 08:38:36.836059 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bxscf"]
Oct 06 08:38:36 crc kubenswrapper[4755]: W1006 08:38:36.867492 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31400554_3153_4f60_aa57_7dcc462d4018.slice/crio-575abc724d9d6e18e1f25efe950b8b34df7c122bfc624f45cee278b80b34d9fc WatchSource:0}: Error finding container 575abc724d9d6e18e1f25efe950b8b34df7c122bfc624f45cee278b80b34d9fc: Status 404 returned error can't find the container with id 575abc724d9d6e18e1f25efe950b8b34df7c122bfc624f45cee278b80b34d9fc
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.153371 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.158850 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8bef-account-create-2qpz4"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.191394 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ef82-account-create-4v5fj"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.225044 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-8jxkp"]
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.231412 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.297839 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-ovsdbserver-nb\") pod \"7ca46c65-cbb8-496b-9639-05952a779e26\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") "
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.297908 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvfsw\" (UniqueName: \"kubernetes.io/projected/7ca46c65-cbb8-496b-9639-05952a779e26-kube-api-access-kvfsw\") pod \"7ca46c65-cbb8-496b-9639-05952a779e26\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") "
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.297934 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-ovsdbserver-sb\") pod \"7ca46c65-cbb8-496b-9639-05952a779e26\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") "
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.297959 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9kgq\" (UniqueName: \"kubernetes.io/projected/9265504b-7527-495e-86bb-6042cc6ddec7-kube-api-access-g9kgq\") pod \"9265504b-7527-495e-86bb-6042cc6ddec7\" (UID: \"9265504b-7527-495e-86bb-6042cc6ddec7\") "
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.298058 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-dns-svc\") pod \"7ca46c65-cbb8-496b-9639-05952a779e26\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") "
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.298122 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbsj2\" (UniqueName: \"kubernetes.io/projected/1c44bae0-2561-4bfd-9a0d-0f5130838f9c-kube-api-access-lbsj2\") pod \"1c44bae0-2561-4bfd-9a0d-0f5130838f9c\" (UID: \"1c44bae0-2561-4bfd-9a0d-0f5130838f9c\") "
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.298162 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-config\") pod \"7ca46c65-cbb8-496b-9639-05952a779e26\" (UID: \"7ca46c65-cbb8-496b-9639-05952a779e26\") "
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.303456 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c44bae0-2561-4bfd-9a0d-0f5130838f9c-kube-api-access-lbsj2" (OuterVolumeSpecName: "kube-api-access-lbsj2") pod "1c44bae0-2561-4bfd-9a0d-0f5130838f9c" (UID: "1c44bae0-2561-4bfd-9a0d-0f5130838f9c"). InnerVolumeSpecName "kube-api-access-lbsj2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.305524 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca46c65-cbb8-496b-9639-05952a779e26-kube-api-access-kvfsw" (OuterVolumeSpecName: "kube-api-access-kvfsw") pod "7ca46c65-cbb8-496b-9639-05952a779e26" (UID: "7ca46c65-cbb8-496b-9639-05952a779e26"). InnerVolumeSpecName "kube-api-access-kvfsw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.305948 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9265504b-7527-495e-86bb-6042cc6ddec7-kube-api-access-g9kgq" (OuterVolumeSpecName: "kube-api-access-g9kgq") pod "9265504b-7527-495e-86bb-6042cc6ddec7" (UID: "9265504b-7527-495e-86bb-6042cc6ddec7"). InnerVolumeSpecName "kube-api-access-g9kgq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.358003 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7ca46c65-cbb8-496b-9639-05952a779e26" (UID: "7ca46c65-cbb8-496b-9639-05952a779e26"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.362264 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-config" (OuterVolumeSpecName: "config") pod "7ca46c65-cbb8-496b-9639-05952a779e26" (UID: "7ca46c65-cbb8-496b-9639-05952a779e26"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.370553 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7ca46c65-cbb8-496b-9639-05952a779e26" (UID: "7ca46c65-cbb8-496b-9639-05952a779e26"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.393685 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ca46c65-cbb8-496b-9639-05952a779e26" (UID: "7ca46c65-cbb8-496b-9639-05952a779e26"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.396748 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-lxx5f"]
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.399741 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.399768 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbsj2\" (UniqueName: \"kubernetes.io/projected/1c44bae0-2561-4bfd-9a0d-0f5130838f9c-kube-api-access-lbsj2\") on node \"crc\" DevicePath \"\""
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.399778 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-config\") on node \"crc\" DevicePath \"\""
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.399787 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.399796 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvfsw\" (UniqueName: \"kubernetes.io/projected/7ca46c65-cbb8-496b-9639-05952a779e26-kube-api-access-kvfsw\") on node \"crc\" DevicePath \"\""
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.399804 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ca46c65-cbb8-496b-9639-05952a779e26-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.399814 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9kgq\" (UniqueName: \"kubernetes.io/projected/9265504b-7527-495e-86bb-6042cc6ddec7-kube-api-access-g9kgq\") on node \"crc\" DevicePath \"\""
Oct 06 08:38:37 crc kubenswrapper[4755]: W1006 08:38:37.408461 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffad361d_03f7_4ed8_938c_013349c3eab0.slice/crio-c72cf6323afcab861a193f63e66f7dec0a67e746a4516335473424385c10c219 WatchSource:0}: Error finding container c72cf6323afcab861a193f63e66f7dec0a67e746a4516335473424385c10c219: Status 404 returned error can't find the container with id c72cf6323afcab861a193f63e66f7dec0a67e746a4516335473424385c10c219
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.414620 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fmkjk"]
Oct 06 08:38:37 crc kubenswrapper[4755]: W1006 08:38:37.419025 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce5daa56_27db_42d7_9f80_cb230c855299.slice/crio-21a4909172dd358ce35ce1bc636cf4d53095014a5c9bc80c94cea4d27f3a905b WatchSource:0}: Error finding container 21a4909172dd358ce35ce1bc636cf4d53095014a5c9bc80c94cea4d27f3a905b: Status 404 returned error can't find the container with id 21a4909172dd358ce35ce1bc636cf4d53095014a5c9bc80c94cea4d27f3a905b
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.588429 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8bef-account-create-2qpz4" event={"ID":"9265504b-7527-495e-86bb-6042cc6ddec7","Type":"ContainerDied","Data":"32d735436aead6362b2b98b8bec64d1d112a14100e47f4175fab5852e3ee02a4"}
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.588831 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32d735436aead6362b2b98b8bec64d1d112a14100e47f4175fab5852e3ee02a4"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.588897 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8bef-account-create-2qpz4"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.596626 4755 generic.go:334] "Generic (PLEG): container finished" podID="c15438db-5fb5-4c27-88f8-74bcf28d9283" containerID="de69e9c74c55a726641db1ef88bf77343736a30325af2dc6c813c85599697c1b" exitCode=0
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.596913 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" event={"ID":"c15438db-5fb5-4c27-88f8-74bcf28d9283","Type":"ContainerDied","Data":"de69e9c74c55a726641db1ef88bf77343736a30325af2dc6c813c85599697c1b"}
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.596943 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" event={"ID":"c15438db-5fb5-4c27-88f8-74bcf28d9283","Type":"ContainerStarted","Data":"f23d65f526cab92ed3f5fc8f9f03c074b755784bf26f14aaefa0358368cc7347"}
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.602335 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1b65447-5db8-480f-a0d5-17a674f2c401","Type":"ContainerStarted","Data":"a0399d7de5c9d99b56aab33dd0843ba233ec2c1514e39c1986498491dabb5107"}
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.609486 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ef82-account-create-4v5fj" event={"ID":"1c44bae0-2561-4bfd-9a0d-0f5130838f9c","Type":"ContainerDied","Data":"acd8d79300bfe04ba7c34ebf2f55a34c58d4224b90be0f0f75aa5b9acb7e2375"}
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.609550 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acd8d79300bfe04ba7c34ebf2f55a34c58d4224b90be0f0f75aa5b9acb7e2375"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.609671 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ef82-account-create-4v5fj"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.631091 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bxscf" event={"ID":"31400554-3153-4f60-aa57-7dcc462d4018","Type":"ContainerStarted","Data":"6f10384233b1d78bf64b174774e2195e2fcef63c60bb6643ffdd5b7665911c6b"}
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.631168 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bxscf" event={"ID":"31400554-3153-4f60-aa57-7dcc462d4018","Type":"ContainerStarted","Data":"575abc724d9d6e18e1f25efe950b8b34df7c122bfc624f45cee278b80b34d9fc"}
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.639385 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fmkjk" event={"ID":"ce5daa56-27db-42d7-9f80-cb230c855299","Type":"ContainerStarted","Data":"21a4909172dd358ce35ce1bc636cf4d53095014a5c9bc80c94cea4d27f3a905b"}
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.659467 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bxscf" podStartSLOduration=2.6594470980000002 podStartE2EDuration="2.659447098s" podCreationTimestamp="2025-10-06 08:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:37.656167364 +0000 UTC m=+974.485482588" watchObservedRunningTime="2025-10-06 08:38:37.659447098 +0000 UTC m=+974.488762312"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.660816 4755 generic.go:334] "Generic (PLEG): container finished" podID="7ca46c65-cbb8-496b-9639-05952a779e26" containerID="9bd96c3e41ffea82bdee85f2297492f4ee400ed814f5cc0fbfd8a253cd98462a" exitCode=0
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.660941 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" event={"ID":"7ca46c65-cbb8-496b-9639-05952a779e26","Type":"ContainerDied","Data":"9bd96c3e41ffea82bdee85f2297492f4ee400ed814f5cc0fbfd8a253cd98462a"}
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.660975 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq" event={"ID":"7ca46c65-cbb8-496b-9639-05952a779e26","Type":"ContainerDied","Data":"44e4d5e1b4ef87df54a5fcb37e3ac6fe9228d059d77ec0b5e81199c39031af5d"}
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.660997 4755 scope.go:117] "RemoveContainer" containerID="9bd96c3e41ffea82bdee85f2297492f4ee400ed814f5cc0fbfd8a253cd98462a"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.661174 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-plzpq"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.671811 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f" event={"ID":"ffad361d-03f7-4ed8-938c-013349c3eab0","Type":"ContainerStarted","Data":"c72cf6323afcab861a193f63e66f7dec0a67e746a4516335473424385c10c219"}
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.719999 4755 scope.go:117] "RemoveContainer" containerID="293a2615ad64f6ae0725ea7f9804d735c854c1e061d6e7fb28ab1999a68f6620"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.746920 4755 scope.go:117] "RemoveContainer" containerID="9bd96c3e41ffea82bdee85f2297492f4ee400ed814f5cc0fbfd8a253cd98462a"
Oct 06 08:38:37 crc kubenswrapper[4755]: E1006 08:38:37.748267 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd96c3e41ffea82bdee85f2297492f4ee400ed814f5cc0fbfd8a253cd98462a\": container with ID starting with 9bd96c3e41ffea82bdee85f2297492f4ee400ed814f5cc0fbfd8a253cd98462a not found: ID does not exist" containerID="9bd96c3e41ffea82bdee85f2297492f4ee400ed814f5cc0fbfd8a253cd98462a"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.748319 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd96c3e41ffea82bdee85f2297492f4ee400ed814f5cc0fbfd8a253cd98462a"} err="failed to get container status \"9bd96c3e41ffea82bdee85f2297492f4ee400ed814f5cc0fbfd8a253cd98462a\": rpc error: code = NotFound desc = could not find container \"9bd96c3e41ffea82bdee85f2297492f4ee400ed814f5cc0fbfd8a253cd98462a\": container with ID starting with 9bd96c3e41ffea82bdee85f2297492f4ee400ed814f5cc0fbfd8a253cd98462a not found: ID does not exist"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.748358 4755 scope.go:117] "RemoveContainer" containerID="293a2615ad64f6ae0725ea7f9804d735c854c1e061d6e7fb28ab1999a68f6620"
Oct 06 08:38:37 crc kubenswrapper[4755]: E1006 08:38:37.755356 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"293a2615ad64f6ae0725ea7f9804d735c854c1e061d6e7fb28ab1999a68f6620\": container with ID starting with 293a2615ad64f6ae0725ea7f9804d735c854c1e061d6e7fb28ab1999a68f6620 not found: ID does not exist" containerID="293a2615ad64f6ae0725ea7f9804d735c854c1e061d6e7fb28ab1999a68f6620"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.755407 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"293a2615ad64f6ae0725ea7f9804d735c854c1e061d6e7fb28ab1999a68f6620"} err="failed to get container status \"293a2615ad64f6ae0725ea7f9804d735c854c1e061d6e7fb28ab1999a68f6620\": rpc error: code = NotFound desc = could not find container \"293a2615ad64f6ae0725ea7f9804d735c854c1e061d6e7fb28ab1999a68f6620\": container with ID starting with 293a2615ad64f6ae0725ea7f9804d735c854c1e061d6e7fb28ab1999a68f6620 not found: ID does not exist"
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.768368 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-plzpq"]
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.778666 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-plzpq"]
Oct 06 08:38:37 crc kubenswrapper[4755]: I1006 08:38:37.901708 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca46c65-cbb8-496b-9639-05952a779e26" path="/var/lib/kubelet/pods/7ca46c65-cbb8-496b-9639-05952a779e26/volumes"
Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.055389 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-8jxkp"
Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.111093 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.122628 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-ovsdbserver-sb\") pod \"c15438db-5fb5-4c27-88f8-74bcf28d9283\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") "
Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.122711 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-ovsdbserver-nb\") pod \"c15438db-5fb5-4c27-88f8-74bcf28d9283\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") "
Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.122742 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-dns-svc\") pod \"c15438db-5fb5-4c27-88f8-74bcf28d9283\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") "
Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.122763 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-config\") pod \"c15438db-5fb5-4c27-88f8-74bcf28d9283\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") "
Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.122913 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9597f\" (UniqueName: \"kubernetes.io/projected/c15438db-5fb5-4c27-88f8-74bcf28d9283-kube-api-access-9597f\") pod \"c15438db-5fb5-4c27-88f8-74bcf28d9283\" (UID: \"c15438db-5fb5-4c27-88f8-74bcf28d9283\") "
Oct 06 08:38:38 crc
kubenswrapper[4755]: I1006 08:38:38.129739 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c15438db-5fb5-4c27-88f8-74bcf28d9283-kube-api-access-9597f" (OuterVolumeSpecName: "kube-api-access-9597f") pod "c15438db-5fb5-4c27-88f8-74bcf28d9283" (UID: "c15438db-5fb5-4c27-88f8-74bcf28d9283"). InnerVolumeSpecName "kube-api-access-9597f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.147463 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c15438db-5fb5-4c27-88f8-74bcf28d9283" (UID: "c15438db-5fb5-4c27-88f8-74bcf28d9283"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.150527 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c15438db-5fb5-4c27-88f8-74bcf28d9283" (UID: "c15438db-5fb5-4c27-88f8-74bcf28d9283"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.161892 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-config" (OuterVolumeSpecName: "config") pod "c15438db-5fb5-4c27-88f8-74bcf28d9283" (UID: "c15438db-5fb5-4c27-88f8-74bcf28d9283"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.167408 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c15438db-5fb5-4c27-88f8-74bcf28d9283" (UID: "c15438db-5fb5-4c27-88f8-74bcf28d9283"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.225745 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.225815 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.226031 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.226052 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9597f\" (UniqueName: \"kubernetes.io/projected/c15438db-5fb5-4c27-88f8-74bcf28d9283-kube-api-access-9597f\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.226066 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c15438db-5fb5-4c27-88f8-74bcf28d9283-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.685388 4755 generic.go:334] "Generic (PLEG): container finished" podID="ffad361d-03f7-4ed8-938c-013349c3eab0" 
containerID="2cf34babb62f405eaf9271a4ca4b5c9a82266efbd9c98fdb3b4e828e84917f64" exitCode=0 Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.685761 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f" event={"ID":"ffad361d-03f7-4ed8-938c-013349c3eab0","Type":"ContainerDied","Data":"2cf34babb62f405eaf9271a4ca4b5c9a82266efbd9c98fdb3b4e828e84917f64"} Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.695884 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" event={"ID":"c15438db-5fb5-4c27-88f8-74bcf28d9283","Type":"ContainerDied","Data":"f23d65f526cab92ed3f5fc8f9f03c074b755784bf26f14aaefa0358368cc7347"} Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.696195 4755 scope.go:117] "RemoveContainer" containerID="de69e9c74c55a726641db1ef88bf77343736a30325af2dc6c813c85599697c1b" Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.696382 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-8jxkp" Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.926910 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-8jxkp"] Oct 06 08:38:38 crc kubenswrapper[4755]: I1006 08:38:38.934486 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-8jxkp"] Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.492432 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-c6wg6"] Oct 06 08:38:39 crc kubenswrapper[4755]: E1006 08:38:39.493014 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9265504b-7527-495e-86bb-6042cc6ddec7" containerName="mariadb-account-create" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.493033 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9265504b-7527-495e-86bb-6042cc6ddec7" containerName="mariadb-account-create" Oct 06 08:38:39 crc kubenswrapper[4755]: E1006 08:38:39.493062 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c15438db-5fb5-4c27-88f8-74bcf28d9283" containerName="init" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.493070 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c15438db-5fb5-4c27-88f8-74bcf28d9283" containerName="init" Oct 06 08:38:39 crc kubenswrapper[4755]: E1006 08:38:39.493079 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca46c65-cbb8-496b-9639-05952a779e26" containerName="dnsmasq-dns" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.493089 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca46c65-cbb8-496b-9639-05952a779e26" containerName="dnsmasq-dns" Oct 06 08:38:39 crc kubenswrapper[4755]: E1006 08:38:39.493095 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c44bae0-2561-4bfd-9a0d-0f5130838f9c" containerName="mariadb-account-create" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 
08:38:39.493104 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c44bae0-2561-4bfd-9a0d-0f5130838f9c" containerName="mariadb-account-create" Oct 06 08:38:39 crc kubenswrapper[4755]: E1006 08:38:39.493126 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca46c65-cbb8-496b-9639-05952a779e26" containerName="init" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.493134 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca46c65-cbb8-496b-9639-05952a779e26" containerName="init" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.493324 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9265504b-7527-495e-86bb-6042cc6ddec7" containerName="mariadb-account-create" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.493349 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c15438db-5fb5-4c27-88f8-74bcf28d9283" containerName="init" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.493372 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca46c65-cbb8-496b-9639-05952a779e26" containerName="dnsmasq-dns" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.493384 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c44bae0-2561-4bfd-9a0d-0f5130838f9c" containerName="mariadb-account-create" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.493982 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.496955 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9fgk5" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.497012 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.507855 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-c6wg6"] Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.512075 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.553245 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-etc-machine-id\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.554204 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-config-data\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.554304 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-combined-ca-bundle\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.554401 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flrfk\" (UniqueName: \"kubernetes.io/projected/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-kube-api-access-flrfk\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.554555 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-db-sync-config-data\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.554684 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-scripts\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.656601 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-db-sync-config-data\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.656694 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-scripts\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.656748 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-etc-machine-id\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.656774 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-config-data\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.656797 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-combined-ca-bundle\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.656829 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flrfk\" (UniqueName: \"kubernetes.io/projected/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-kube-api-access-flrfk\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.656883 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-etc-machine-id\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.663411 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-config-data\") pod \"cinder-db-sync-c6wg6\" (UID: 
\"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.663476 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-combined-ca-bundle\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.669531 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-scripts\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.672769 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-db-sync-config-data\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.673785 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flrfk\" (UniqueName: \"kubernetes.io/projected/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-kube-api-access-flrfk\") pod \"cinder-db-sync-c6wg6\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.740316 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f" event={"ID":"ffad361d-03f7-4ed8-938c-013349c3eab0","Type":"ContainerStarted","Data":"7d3fc1e742754cda9a279a87c4ac907df4f637805d58ce84f78e865585e478ae"} Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.740766 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.770606 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-rxxsl"] Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.772356 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rxxsl" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.781326 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.782380 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6tppc" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.788623 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rxxsl"] Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.790708 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f" podStartSLOduration=3.7906783490000002 podStartE2EDuration="3.790678349s" podCreationTimestamp="2025-10-06 08:38:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:39.768343432 +0000 UTC m=+976.597658646" watchObservedRunningTime="2025-10-06 08:38:39.790678349 +0000 UTC m=+976.619993563" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.812464 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.859085 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d1c5df-48c1-4df7-9b04-c19e9510168a-combined-ca-bundle\") pod \"barbican-db-sync-rxxsl\" (UID: \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\") " pod="openstack/barbican-db-sync-rxxsl" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.859168 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b2d1c5df-48c1-4df7-9b04-c19e9510168a-db-sync-config-data\") pod \"barbican-db-sync-rxxsl\" (UID: \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\") " pod="openstack/barbican-db-sync-rxxsl" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.859241 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2hj9\" (UniqueName: \"kubernetes.io/projected/b2d1c5df-48c1-4df7-9b04-c19e9510168a-kube-api-access-n2hj9\") pod \"barbican-db-sync-rxxsl\" (UID: \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\") " pod="openstack/barbican-db-sync-rxxsl" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.895809 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c15438db-5fb5-4c27-88f8-74bcf28d9283" path="/var/lib/kubelet/pods/c15438db-5fb5-4c27-88f8-74bcf28d9283/volumes" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.961316 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d1c5df-48c1-4df7-9b04-c19e9510168a-combined-ca-bundle\") pod \"barbican-db-sync-rxxsl\" (UID: \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\") " pod="openstack/barbican-db-sync-rxxsl" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.961412 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b2d1c5df-48c1-4df7-9b04-c19e9510168a-db-sync-config-data\") pod \"barbican-db-sync-rxxsl\" (UID: \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\") " pod="openstack/barbican-db-sync-rxxsl" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.961476 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2hj9\" (UniqueName: \"kubernetes.io/projected/b2d1c5df-48c1-4df7-9b04-c19e9510168a-kube-api-access-n2hj9\") pod \"barbican-db-sync-rxxsl\" (UID: \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\") " pod="openstack/barbican-db-sync-rxxsl" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.964992 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b2d1c5df-48c1-4df7-9b04-c19e9510168a-db-sync-config-data\") pod \"barbican-db-sync-rxxsl\" (UID: \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\") " pod="openstack/barbican-db-sync-rxxsl" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.972343 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d1c5df-48c1-4df7-9b04-c19e9510168a-combined-ca-bundle\") pod \"barbican-db-sync-rxxsl\" (UID: \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\") " pod="openstack/barbican-db-sync-rxxsl" Oct 06 08:38:39 crc kubenswrapper[4755]: I1006 08:38:39.984023 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2hj9\" (UniqueName: \"kubernetes.io/projected/b2d1c5df-48c1-4df7-9b04-c19e9510168a-kube-api-access-n2hj9\") pod \"barbican-db-sync-rxxsl\" (UID: \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\") " pod="openstack/barbican-db-sync-rxxsl" Oct 06 08:38:40 crc kubenswrapper[4755]: I1006 08:38:40.102479 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rxxsl" Oct 06 08:38:40 crc kubenswrapper[4755]: I1006 08:38:40.759294 4755 generic.go:334] "Generic (PLEG): container finished" podID="31400554-3153-4f60-aa57-7dcc462d4018" containerID="6f10384233b1d78bf64b174774e2195e2fcef63c60bb6643ffdd5b7665911c6b" exitCode=0 Oct 06 08:38:40 crc kubenswrapper[4755]: I1006 08:38:40.759689 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bxscf" event={"ID":"31400554-3153-4f60-aa57-7dcc462d4018","Type":"ContainerDied","Data":"6f10384233b1d78bf64b174774e2195e2fcef63c60bb6643ffdd5b7665911c6b"} Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.192932 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.347607 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-config-data\") pod \"31400554-3153-4f60-aa57-7dcc462d4018\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.347971 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-fernet-keys\") pod \"31400554-3153-4f60-aa57-7dcc462d4018\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.347991 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-combined-ca-bundle\") pod \"31400554-3153-4f60-aa57-7dcc462d4018\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.348016 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-scripts\") pod \"31400554-3153-4f60-aa57-7dcc462d4018\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.348128 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45cvq\" (UniqueName: \"kubernetes.io/projected/31400554-3153-4f60-aa57-7dcc462d4018-kube-api-access-45cvq\") pod \"31400554-3153-4f60-aa57-7dcc462d4018\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.348176 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-credential-keys\") pod \"31400554-3153-4f60-aa57-7dcc462d4018\" (UID: \"31400554-3153-4f60-aa57-7dcc462d4018\") " Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.355778 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "31400554-3153-4f60-aa57-7dcc462d4018" (UID: "31400554-3153-4f60-aa57-7dcc462d4018"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.355870 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31400554-3153-4f60-aa57-7dcc462d4018-kube-api-access-45cvq" (OuterVolumeSpecName: "kube-api-access-45cvq") pod "31400554-3153-4f60-aa57-7dcc462d4018" (UID: "31400554-3153-4f60-aa57-7dcc462d4018"). InnerVolumeSpecName "kube-api-access-45cvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.356732 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-scripts" (OuterVolumeSpecName: "scripts") pod "31400554-3153-4f60-aa57-7dcc462d4018" (UID: "31400554-3153-4f60-aa57-7dcc462d4018"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.359239 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "31400554-3153-4f60-aa57-7dcc462d4018" (UID: "31400554-3153-4f60-aa57-7dcc462d4018"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.381381 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31400554-3153-4f60-aa57-7dcc462d4018" (UID: "31400554-3153-4f60-aa57-7dcc462d4018"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.383381 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-config-data" (OuterVolumeSpecName: "config-data") pod "31400554-3153-4f60-aa57-7dcc462d4018" (UID: "31400554-3153-4f60-aa57-7dcc462d4018"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.451798 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45cvq\" (UniqueName: \"kubernetes.io/projected/31400554-3153-4f60-aa57-7dcc462d4018-kube-api-access-45cvq\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.451842 4755 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.451855 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.451868 4755 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.451880 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.451891 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31400554-3153-4f60-aa57-7dcc462d4018-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.608476 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-c6wg6"] Oct 06 08:38:43 crc kubenswrapper[4755]: W1006 08:38:43.620501 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9755bfc9_d53e_4848_8d4b_04fdef46a4ea.slice/crio-54416962072eba1abe39696c7cb889aeec7f2a5bbcf99deaaf2084229759f0ca WatchSource:0}: Error finding container 54416962072eba1abe39696c7cb889aeec7f2a5bbcf99deaaf2084229759f0ca: Status 404 returned error can't find the container with id 54416962072eba1abe39696c7cb889aeec7f2a5bbcf99deaaf2084229759f0ca Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.687492 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rxxsl"] Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.787881 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rxxsl" event={"ID":"b2d1c5df-48c1-4df7-9b04-c19e9510168a","Type":"ContainerStarted","Data":"ccf0ac220bf41880bde6305152ff803a3162e0fa8e5da2908e53921a09262982"} Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.790133 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1b65447-5db8-480f-a0d5-17a674f2c401","Type":"ContainerStarted","Data":"24ffdb24a4b1f45c324ee02d1996f1676002f63c177e50749e001f7366cd2275"} Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.792522 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bxscf" event={"ID":"31400554-3153-4f60-aa57-7dcc462d4018","Type":"ContainerDied","Data":"575abc724d9d6e18e1f25efe950b8b34df7c122bfc624f45cee278b80b34d9fc"} Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.792548 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="575abc724d9d6e18e1f25efe950b8b34df7c122bfc624f45cee278b80b34d9fc" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.792555 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bxscf" Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.794480 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fmkjk" event={"ID":"ce5daa56-27db-42d7-9f80-cb230c855299","Type":"ContainerStarted","Data":"c6c16b1460f709c0c887bf57cf3edf06a0fb001a6edc33d3179ca7308720e9e2"} Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.796377 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-c6wg6" event={"ID":"9755bfc9-d53e-4848-8d4b-04fdef46a4ea","Type":"ContainerStarted","Data":"54416962072eba1abe39696c7cb889aeec7f2a5bbcf99deaaf2084229759f0ca"} Oct 06 08:38:43 crc kubenswrapper[4755]: I1006 08:38:43.814292 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-fmkjk" podStartSLOduration=2.147370972 podStartE2EDuration="7.814276954s" podCreationTimestamp="2025-10-06 08:38:36 +0000 UTC" firstStartedPulling="2025-10-06 08:38:37.421482922 +0000 UTC m=+974.250798136" lastFinishedPulling="2025-10-06 08:38:43.088388904 +0000 UTC m=+979.917704118" observedRunningTime="2025-10-06 08:38:43.812887649 +0000 UTC m=+980.642202863" watchObservedRunningTime="2025-10-06 08:38:43.814276954 +0000 UTC m=+980.643592168" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.338811 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bxscf"] Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.350777 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bxscf"] Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.455640 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-95kzq"] Oct 06 08:38:44 crc kubenswrapper[4755]: E1006 08:38:44.456059 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31400554-3153-4f60-aa57-7dcc462d4018" containerName="keystone-bootstrap" Oct 06 
08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.456077 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="31400554-3153-4f60-aa57-7dcc462d4018" containerName="keystone-bootstrap" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.456315 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="31400554-3153-4f60-aa57-7dcc462d4018" containerName="keystone-bootstrap" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.456973 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.459219 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.459424 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.459555 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xlwvt" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.459584 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.462465 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-95kzq"] Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.570837 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-fernet-keys\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.571069 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-scripts\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.571161 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-credential-keys\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.571211 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-config-data\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.571336 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6lx2\" (UniqueName: \"kubernetes.io/projected/d420f11b-9596-4bbf-9a4c-c13e39020db9-kube-api-access-z6lx2\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.571392 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-combined-ca-bundle\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.634978 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-73af-account-create-cfnhn"] Oct 06 08:38:44 crc 
kubenswrapper[4755]: I1006 08:38:44.636510 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73af-account-create-cfnhn" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.641092 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.644734 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-73af-account-create-cfnhn"] Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.675748 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-credential-keys\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.675804 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-config-data\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.675885 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6lx2\" (UniqueName: \"kubernetes.io/projected/d420f11b-9596-4bbf-9a4c-c13e39020db9-kube-api-access-z6lx2\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.675952 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-combined-ca-bundle\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " 
pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.677282 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-fernet-keys\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.677544 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-scripts\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.682012 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-scripts\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.682155 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-fernet-keys\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.683066 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-credential-keys\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.684248 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-combined-ca-bundle\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.702726 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-config-data\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.702775 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6lx2\" (UniqueName: \"kubernetes.io/projected/d420f11b-9596-4bbf-9a4c-c13e39020db9-kube-api-access-z6lx2\") pod \"keystone-bootstrap-95kzq\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.779042 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhwv5\" (UniqueName: \"kubernetes.io/projected/370fc33c-40d0-42b7-9822-a44512c4d881-kube-api-access-vhwv5\") pod \"neutron-73af-account-create-cfnhn\" (UID: \"370fc33c-40d0-42b7-9822-a44512c4d881\") " pod="openstack/neutron-73af-account-create-cfnhn" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.802585 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.881833 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhwv5\" (UniqueName: \"kubernetes.io/projected/370fc33c-40d0-42b7-9822-a44512c4d881-kube-api-access-vhwv5\") pod \"neutron-73af-account-create-cfnhn\" (UID: \"370fc33c-40d0-42b7-9822-a44512c4d881\") " pod="openstack/neutron-73af-account-create-cfnhn" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.906256 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhwv5\" (UniqueName: \"kubernetes.io/projected/370fc33c-40d0-42b7-9822-a44512c4d881-kube-api-access-vhwv5\") pod \"neutron-73af-account-create-cfnhn\" (UID: \"370fc33c-40d0-42b7-9822-a44512c4d881\") " pod="openstack/neutron-73af-account-create-cfnhn" Oct 06 08:38:44 crc kubenswrapper[4755]: I1006 08:38:44.955997 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73af-account-create-cfnhn" Oct 06 08:38:45 crc kubenswrapper[4755]: I1006 08:38:45.270902 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-95kzq"] Oct 06 08:38:45 crc kubenswrapper[4755]: I1006 08:38:45.454010 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-73af-account-create-cfnhn"] Oct 06 08:38:45 crc kubenswrapper[4755]: I1006 08:38:45.815989 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-95kzq" event={"ID":"d420f11b-9596-4bbf-9a4c-c13e39020db9","Type":"ContainerStarted","Data":"5fd8320b76815c6a615e0b5aec5fd5060e9162980330e15da428dbf614ae81f5"} Oct 06 08:38:45 crc kubenswrapper[4755]: I1006 08:38:45.816421 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-95kzq" 
event={"ID":"d420f11b-9596-4bbf-9a4c-c13e39020db9","Type":"ContainerStarted","Data":"a12ef66069dbb708b9fb5f94d56128d45f21fcaaac5487f1ad6693c790b4520e"} Oct 06 08:38:45 crc kubenswrapper[4755]: I1006 08:38:45.817484 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73af-account-create-cfnhn" event={"ID":"370fc33c-40d0-42b7-9822-a44512c4d881","Type":"ContainerStarted","Data":"17498905c8cc91cfcb90d39b6d6e969e206ab8d24c8e7d3eb7f04bbf3ca2efaa"} Oct 06 08:38:45 crc kubenswrapper[4755]: I1006 08:38:45.854477 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-95kzq" podStartSLOduration=1.854453066 podStartE2EDuration="1.854453066s" podCreationTimestamp="2025-10-06 08:38:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:45.842605915 +0000 UTC m=+982.671921149" watchObservedRunningTime="2025-10-06 08:38:45.854453066 +0000 UTC m=+982.683768290" Oct 06 08:38:45 crc kubenswrapper[4755]: I1006 08:38:45.889264 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31400554-3153-4f60-aa57-7dcc462d4018" path="/var/lib/kubelet/pods/31400554-3153-4f60-aa57-7dcc462d4018/volumes" Oct 06 08:38:46 crc kubenswrapper[4755]: I1006 08:38:46.720273 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f" Oct 06 08:38:46 crc kubenswrapper[4755]: I1006 08:38:46.866124 4755 generic.go:334] "Generic (PLEG): container finished" podID="ce5daa56-27db-42d7-9f80-cb230c855299" containerID="c6c16b1460f709c0c887bf57cf3edf06a0fb001a6edc33d3179ca7308720e9e2" exitCode=0 Oct 06 08:38:46 crc kubenswrapper[4755]: I1006 08:38:46.866219 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fmkjk" 
event={"ID":"ce5daa56-27db-42d7-9f80-cb230c855299","Type":"ContainerDied","Data":"c6c16b1460f709c0c887bf57cf3edf06a0fb001a6edc33d3179ca7308720e9e2"} Oct 06 08:38:46 crc kubenswrapper[4755]: I1006 08:38:46.871685 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bsqzt"] Oct 06 08:38:46 crc kubenswrapper[4755]: I1006 08:38:46.872732 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" podUID="3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" containerName="dnsmasq-dns" containerID="cri-o://6474e483a747d46e7481bb383840eec31fc58ad2ead94f9e53a095811837a5a3" gracePeriod=10 Oct 06 08:38:46 crc kubenswrapper[4755]: I1006 08:38:46.873173 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1b65447-5db8-480f-a0d5-17a674f2c401","Type":"ContainerStarted","Data":"00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce"} Oct 06 08:38:46 crc kubenswrapper[4755]: I1006 08:38:46.889773 4755 generic.go:334] "Generic (PLEG): container finished" podID="370fc33c-40d0-42b7-9822-a44512c4d881" containerID="7a7cce3628c1484c9173dd39feb5725d6bd6cf8b44a4216de400932d6e57b963" exitCode=0 Oct 06 08:38:46 crc kubenswrapper[4755]: I1006 08:38:46.889905 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73af-account-create-cfnhn" event={"ID":"370fc33c-40d0-42b7-9822-a44512c4d881","Type":"ContainerDied","Data":"7a7cce3628c1484c9173dd39feb5725d6bd6cf8b44a4216de400932d6e57b963"} Oct 06 08:38:47 crc kubenswrapper[4755]: I1006 08:38:47.745313 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" podUID="3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Oct 06 08:38:47 crc kubenswrapper[4755]: I1006 08:38:47.901699 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" containerID="6474e483a747d46e7481bb383840eec31fc58ad2ead94f9e53a095811837a5a3" exitCode=0 Oct 06 08:38:47 crc kubenswrapper[4755]: I1006 08:38:47.901755 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" event={"ID":"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b","Type":"ContainerDied","Data":"6474e483a747d46e7481bb383840eec31fc58ad2ead94f9e53a095811837a5a3"} Oct 06 08:38:48 crc kubenswrapper[4755]: I1006 08:38:48.912464 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:38:48 crc kubenswrapper[4755]: I1006 08:38:48.913051 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.446639 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fmkjk" Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.506124 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-combined-ca-bundle\") pod \"ce5daa56-27db-42d7-9f80-cb230c855299\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.506234 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-config-data\") pod \"ce5daa56-27db-42d7-9f80-cb230c855299\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.506360 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-scripts\") pod \"ce5daa56-27db-42d7-9f80-cb230c855299\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.506392 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce5daa56-27db-42d7-9f80-cb230c855299-logs\") pod \"ce5daa56-27db-42d7-9f80-cb230c855299\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.506452 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wlc7\" (UniqueName: \"kubernetes.io/projected/ce5daa56-27db-42d7-9f80-cb230c855299-kube-api-access-2wlc7\") pod \"ce5daa56-27db-42d7-9f80-cb230c855299\" (UID: \"ce5daa56-27db-42d7-9f80-cb230c855299\") " Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.507460 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ce5daa56-27db-42d7-9f80-cb230c855299-logs" (OuterVolumeSpecName: "logs") pod "ce5daa56-27db-42d7-9f80-cb230c855299" (UID: "ce5daa56-27db-42d7-9f80-cb230c855299"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.516877 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-scripts" (OuterVolumeSpecName: "scripts") pod "ce5daa56-27db-42d7-9f80-cb230c855299" (UID: "ce5daa56-27db-42d7-9f80-cb230c855299"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.523127 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce5daa56-27db-42d7-9f80-cb230c855299-kube-api-access-2wlc7" (OuterVolumeSpecName: "kube-api-access-2wlc7") pod "ce5daa56-27db-42d7-9f80-cb230c855299" (UID: "ce5daa56-27db-42d7-9f80-cb230c855299"). InnerVolumeSpecName "kube-api-access-2wlc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.537352 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-config-data" (OuterVolumeSpecName: "config-data") pod "ce5daa56-27db-42d7-9f80-cb230c855299" (UID: "ce5daa56-27db-42d7-9f80-cb230c855299"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.538250 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce5daa56-27db-42d7-9f80-cb230c855299" (UID: "ce5daa56-27db-42d7-9f80-cb230c855299"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.623896 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.625428 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.627905 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce5daa56-27db-42d7-9f80-cb230c855299-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.628007 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wlc7\" (UniqueName: \"kubernetes.io/projected/ce5daa56-27db-42d7-9f80-cb230c855299-kube-api-access-2wlc7\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.628106 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5daa56-27db-42d7-9f80-cb230c855299-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.949171 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fmkjk" event={"ID":"ce5daa56-27db-42d7-9f80-cb230c855299","Type":"ContainerDied","Data":"21a4909172dd358ce35ce1bc636cf4d53095014a5c9bc80c94cea4d27f3a905b"} Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.949589 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21a4909172dd358ce35ce1bc636cf4d53095014a5c9bc80c94cea4d27f3a905b" Oct 06 08:38:50 crc kubenswrapper[4755]: I1006 08:38:50.949673 4755 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-fmkjk" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.573375 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7646c5cd7b-lvntf"] Oct 06 08:38:51 crc kubenswrapper[4755]: E1006 08:38:51.574861 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5daa56-27db-42d7-9f80-cb230c855299" containerName="placement-db-sync" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.574885 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5daa56-27db-42d7-9f80-cb230c855299" containerName="placement-db-sync" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.575117 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce5daa56-27db-42d7-9f80-cb230c855299" containerName="placement-db-sync" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.588636 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.595174 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.595381 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.595495 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.595626 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.595776 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mpb9p" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.598281 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-7646c5cd7b-lvntf"] Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.619439 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73af-account-create-cfnhn" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.651976 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30667b83-ed3b-414b-af66-45b97ac252c1-logs\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.652330 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdjgr\" (UniqueName: \"kubernetes.io/projected/30667b83-ed3b-414b-af66-45b97ac252c1-kube-api-access-tdjgr\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.652384 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-scripts\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.652409 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-internal-tls-certs\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.652488 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-combined-ca-bundle\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.652520 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-config-data\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.652609 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-public-tls-certs\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.753096 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhwv5\" (UniqueName: \"kubernetes.io/projected/370fc33c-40d0-42b7-9822-a44512c4d881-kube-api-access-vhwv5\") pod \"370fc33c-40d0-42b7-9822-a44512c4d881\" (UID: \"370fc33c-40d0-42b7-9822-a44512c4d881\") " Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.753707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30667b83-ed3b-414b-af66-45b97ac252c1-logs\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.753745 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdjgr\" (UniqueName: 
\"kubernetes.io/projected/30667b83-ed3b-414b-af66-45b97ac252c1-kube-api-access-tdjgr\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.754250 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-scripts\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.754271 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-internal-tls-certs\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.754332 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-combined-ca-bundle\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.754358 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-config-data\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.754603 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-public-tls-certs\") 
pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.754686 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30667b83-ed3b-414b-af66-45b97ac252c1-logs\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.759545 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-scripts\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.759798 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/370fc33c-40d0-42b7-9822-a44512c4d881-kube-api-access-vhwv5" (OuterVolumeSpecName: "kube-api-access-vhwv5") pod "370fc33c-40d0-42b7-9822-a44512c4d881" (UID: "370fc33c-40d0-42b7-9822-a44512c4d881"). InnerVolumeSpecName "kube-api-access-vhwv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.769161 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-public-tls-certs\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.771404 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-config-data\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.771735 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdjgr\" (UniqueName: \"kubernetes.io/projected/30667b83-ed3b-414b-af66-45b97ac252c1-kube-api-access-tdjgr\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.785603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-internal-tls-certs\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.786700 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30667b83-ed3b-414b-af66-45b97ac252c1-combined-ca-bundle\") pod \"placement-7646c5cd7b-lvntf\" (UID: \"30667b83-ed3b-414b-af66-45b97ac252c1\") " pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: 
I1006 08:38:51.855915 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhwv5\" (UniqueName: \"kubernetes.io/projected/370fc33c-40d0-42b7-9822-a44512c4d881-kube-api-access-vhwv5\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.930156 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.962363 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73af-account-create-cfnhn" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.962389 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73af-account-create-cfnhn" event={"ID":"370fc33c-40d0-42b7-9822-a44512c4d881","Type":"ContainerDied","Data":"17498905c8cc91cfcb90d39b6d6e969e206ab8d24c8e7d3eb7f04bbf3ca2efaa"} Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.962463 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17498905c8cc91cfcb90d39b6d6e969e206ab8d24c8e7d3eb7f04bbf3ca2efaa" Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.965084 4755 generic.go:334] "Generic (PLEG): container finished" podID="d420f11b-9596-4bbf-9a4c-c13e39020db9" containerID="5fd8320b76815c6a615e0b5aec5fd5060e9162980330e15da428dbf614ae81f5" exitCode=0 Oct 06 08:38:51 crc kubenswrapper[4755]: I1006 08:38:51.965231 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-95kzq" event={"ID":"d420f11b-9596-4bbf-9a4c-c13e39020db9","Type":"ContainerDied","Data":"5fd8320b76815c6a615e0b5aec5fd5060e9162980330e15da428dbf614ae81f5"} Oct 06 08:38:54 crc kubenswrapper[4755]: I1006 08:38:54.771686 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-dsb4x"] Oct 06 08:38:54 crc kubenswrapper[4755]: E1006 08:38:54.772483 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="370fc33c-40d0-42b7-9822-a44512c4d881" containerName="mariadb-account-create" Oct 06 08:38:54 crc kubenswrapper[4755]: I1006 08:38:54.772498 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="370fc33c-40d0-42b7-9822-a44512c4d881" containerName="mariadb-account-create" Oct 06 08:38:54 crc kubenswrapper[4755]: I1006 08:38:54.772722 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="370fc33c-40d0-42b7-9822-a44512c4d881" containerName="mariadb-account-create" Oct 06 08:38:54 crc kubenswrapper[4755]: I1006 08:38:54.773252 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dsb4x" Oct 06 08:38:54 crc kubenswrapper[4755]: I1006 08:38:54.775419 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-c9flv" Oct 06 08:38:54 crc kubenswrapper[4755]: I1006 08:38:54.775478 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 08:38:54 crc kubenswrapper[4755]: I1006 08:38:54.775673 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 08:38:54 crc kubenswrapper[4755]: I1006 08:38:54.791061 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dsb4x"] Oct 06 08:38:54 crc kubenswrapper[4755]: I1006 08:38:54.959400 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-config\") pod \"neutron-db-sync-dsb4x\" (UID: \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\") " pod="openstack/neutron-db-sync-dsb4x" Oct 06 08:38:54 crc kubenswrapper[4755]: I1006 08:38:54.959506 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-combined-ca-bundle\") pod 
\"neutron-db-sync-dsb4x\" (UID: \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\") " pod="openstack/neutron-db-sync-dsb4x" Oct 06 08:38:54 crc kubenswrapper[4755]: I1006 08:38:54.959537 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg86g\" (UniqueName: \"kubernetes.io/projected/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-kube-api-access-cg86g\") pod \"neutron-db-sync-dsb4x\" (UID: \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\") " pod="openstack/neutron-db-sync-dsb4x" Oct 06 08:38:55 crc kubenswrapper[4755]: I1006 08:38:55.061429 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-combined-ca-bundle\") pod \"neutron-db-sync-dsb4x\" (UID: \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\") " pod="openstack/neutron-db-sync-dsb4x" Oct 06 08:38:55 crc kubenswrapper[4755]: I1006 08:38:55.061484 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg86g\" (UniqueName: \"kubernetes.io/projected/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-kube-api-access-cg86g\") pod \"neutron-db-sync-dsb4x\" (UID: \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\") " pod="openstack/neutron-db-sync-dsb4x" Oct 06 08:38:55 crc kubenswrapper[4755]: I1006 08:38:55.061620 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-config\") pod \"neutron-db-sync-dsb4x\" (UID: \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\") " pod="openstack/neutron-db-sync-dsb4x" Oct 06 08:38:55 crc kubenswrapper[4755]: I1006 08:38:55.068300 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-combined-ca-bundle\") pod \"neutron-db-sync-dsb4x\" (UID: \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\") " 
pod="openstack/neutron-db-sync-dsb4x" Oct 06 08:38:55 crc kubenswrapper[4755]: I1006 08:38:55.074904 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-config\") pod \"neutron-db-sync-dsb4x\" (UID: \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\") " pod="openstack/neutron-db-sync-dsb4x" Oct 06 08:38:55 crc kubenswrapper[4755]: I1006 08:38:55.078219 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg86g\" (UniqueName: \"kubernetes.io/projected/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-kube-api-access-cg86g\") pod \"neutron-db-sync-dsb4x\" (UID: \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\") " pod="openstack/neutron-db-sync-dsb4x" Oct 06 08:38:55 crc kubenswrapper[4755]: I1006 08:38:55.091806 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dsb4x" Oct 06 08:38:57 crc kubenswrapper[4755]: I1006 08:38:57.746775 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" podUID="3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: i/o timeout" Oct 06 08:39:02 crc kubenswrapper[4755]: I1006 08:39:02.748071 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" podUID="3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: i/o timeout" Oct 06 08:39:02 crc kubenswrapper[4755]: I1006 08:39:02.748771 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.062098 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" 
event={"ID":"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b","Type":"ContainerDied","Data":"0cdc1b48cdace607d3602009e4879941b71b53c4d86c31fc5025fde3d8212e39"} Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.062143 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cdc1b48cdace607d3602009e4879941b71b53c4d86c31fc5025fde3d8212e39" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.063965 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-95kzq" event={"ID":"d420f11b-9596-4bbf-9a4c-c13e39020db9","Type":"ContainerDied","Data":"a12ef66069dbb708b9fb5f94d56128d45f21fcaaac5487f1ad6693c790b4520e"} Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.064080 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a12ef66069dbb708b9fb5f94d56128d45f21fcaaac5487f1ad6693c790b4520e" Oct 06 08:39:03 crc kubenswrapper[4755]: E1006 08:39:03.066556 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 06 08:39:03 crc kubenswrapper[4755]: E1006 08:39:03.066750 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flrfk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-c6wg6_openstack(9755bfc9-d53e-4848-8d4b-04fdef46a4ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:39:03 crc kubenswrapper[4755]: E1006 08:39:03.068187 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-c6wg6" podUID="9755bfc9-d53e-4848-8d4b-04fdef46a4ea" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.125402 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.141997 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.202156 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-credential-keys\") pod \"d420f11b-9596-4bbf-9a4c-c13e39020db9\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.202265 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-config-data\") pod \"d420f11b-9596-4bbf-9a4c-c13e39020db9\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.202352 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-dns-svc\") pod \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.202389 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-combined-ca-bundle\") pod \"d420f11b-9596-4bbf-9a4c-c13e39020db9\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.202457 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-fernet-keys\") pod \"d420f11b-9596-4bbf-9a4c-c13e39020db9\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.202497 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-config\") pod \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.202658 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-ovsdbserver-nb\") pod \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.202807 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-ovsdbserver-sb\") pod \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.202839 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-scripts\") pod \"d420f11b-9596-4bbf-9a4c-c13e39020db9\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.202868 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdnff\" (UniqueName: \"kubernetes.io/projected/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-kube-api-access-sdnff\") pod \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\" (UID: \"3f7c4eac-5ffd-4df6-8951-4fdc85e0076b\") " Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.202978 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6lx2\" (UniqueName: \"kubernetes.io/projected/d420f11b-9596-4bbf-9a4c-c13e39020db9-kube-api-access-z6lx2\") pod \"d420f11b-9596-4bbf-9a4c-c13e39020db9\" (UID: \"d420f11b-9596-4bbf-9a4c-c13e39020db9\") " Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 
08:39:03.212481 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d420f11b-9596-4bbf-9a4c-c13e39020db9" (UID: "d420f11b-9596-4bbf-9a4c-c13e39020db9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.212650 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d420f11b-9596-4bbf-9a4c-c13e39020db9" (UID: "d420f11b-9596-4bbf-9a4c-c13e39020db9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.213074 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d420f11b-9596-4bbf-9a4c-c13e39020db9-kube-api-access-z6lx2" (OuterVolumeSpecName: "kube-api-access-z6lx2") pod "d420f11b-9596-4bbf-9a4c-c13e39020db9" (UID: "d420f11b-9596-4bbf-9a4c-c13e39020db9"). InnerVolumeSpecName "kube-api-access-z6lx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.213789 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-scripts" (OuterVolumeSpecName: "scripts") pod "d420f11b-9596-4bbf-9a4c-c13e39020db9" (UID: "d420f11b-9596-4bbf-9a4c-c13e39020db9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.215320 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-kube-api-access-sdnff" (OuterVolumeSpecName: "kube-api-access-sdnff") pod "3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" (UID: "3f7c4eac-5ffd-4df6-8951-4fdc85e0076b"). InnerVolumeSpecName "kube-api-access-sdnff". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.234970 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d420f11b-9596-4bbf-9a4c-c13e39020db9" (UID: "d420f11b-9596-4bbf-9a4c-c13e39020db9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.237825 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-config-data" (OuterVolumeSpecName: "config-data") pod "d420f11b-9596-4bbf-9a4c-c13e39020db9" (UID: "d420f11b-9596-4bbf-9a4c-c13e39020db9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.260612 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-config" (OuterVolumeSpecName: "config") pod "3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" (UID: "3f7c4eac-5ffd-4df6-8951-4fdc85e0076b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.262028 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" (UID: "3f7c4eac-5ffd-4df6-8951-4fdc85e0076b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.278802 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" (UID: "3f7c4eac-5ffd-4df6-8951-4fdc85e0076b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.279214 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" (UID: "3f7c4eac-5ffd-4df6-8951-4fdc85e0076b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.304759 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6lx2\" (UniqueName: \"kubernetes.io/projected/d420f11b-9596-4bbf-9a4c-c13e39020db9-kube-api-access-z6lx2\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.304798 4755 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.304808 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.304818 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.304827 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.304835 4755 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.304843 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.304851 4755 reconciler_common.go:293] 
"Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.304858 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.304866 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d420f11b-9596-4bbf-9a4c-c13e39020db9-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4755]: I1006 08:39:03.304874 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdnff\" (UniqueName: \"kubernetes.io/projected/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b-kube-api-access-sdnff\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.073587 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.073607 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-95kzq" Oct 06 08:39:04 crc kubenswrapper[4755]: E1006 08:39:04.076069 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-c6wg6" podUID="9755bfc9-d53e-4848-8d4b-04fdef46a4ea" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.116348 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bsqzt"] Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.122980 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bsqzt"] Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.337165 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-67f564d7bf-cx47l"] Oct 06 08:39:04 crc kubenswrapper[4755]: E1006 08:39:04.337509 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" containerName="init" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.337527 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" containerName="init" Oct 06 08:39:04 crc kubenswrapper[4755]: E1006 08:39:04.337541 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d420f11b-9596-4bbf-9a4c-c13e39020db9" containerName="keystone-bootstrap" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.337550 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d420f11b-9596-4bbf-9a4c-c13e39020db9" containerName="keystone-bootstrap" Oct 06 08:39:04 crc kubenswrapper[4755]: E1006 08:39:04.337578 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" containerName="dnsmasq-dns" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.337586 4755 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" containerName="dnsmasq-dns" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.337804 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" containerName="dnsmasq-dns" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.337839 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d420f11b-9596-4bbf-9a4c-c13e39020db9" containerName="keystone-bootstrap" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.338479 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.342488 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.342786 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.342914 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xlwvt" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.346116 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.346327 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.353998 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.361927 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-67f564d7bf-cx47l"] Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.429555 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-internal-tls-certs\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.429676 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-combined-ca-bundle\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.429733 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gz6w\" (UniqueName: \"kubernetes.io/projected/e5a3f24e-8de2-49da-aa12-9559bf1c0212-kube-api-access-2gz6w\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.429759 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-public-tls-certs\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.429786 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-credential-keys\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 
08:39:04.429850 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-fernet-keys\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.429909 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-scripts\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.429947 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-config-data\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: E1006 08:39:04.518886 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest" Oct 06 08:39:04 crc kubenswrapper[4755]: E1006 08:39:04.519050 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2rg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(b1b65447-5db8-480f-a0d5-17a674f2c401): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.531029 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-combined-ca-bundle\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.531101 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gz6w\" (UniqueName: 
\"kubernetes.io/projected/e5a3f24e-8de2-49da-aa12-9559bf1c0212-kube-api-access-2gz6w\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.531130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-public-tls-certs\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.531151 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-credential-keys\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.531221 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-fernet-keys\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.531263 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-scripts\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.531307 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-config-data\") pod 
\"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.531352 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-internal-tls-certs\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.535671 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-combined-ca-bundle\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.541225 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-credential-keys\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.541421 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-internal-tls-certs\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.545152 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-scripts\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " 
pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.546466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-fernet-keys\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.546759 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-public-tls-certs\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.553228 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gz6w\" (UniqueName: \"kubernetes.io/projected/e5a3f24e-8de2-49da-aa12-9559bf1c0212-kube-api-access-2gz6w\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.559178 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a3f24e-8de2-49da-aa12-9559bf1c0212-config-data\") pod \"keystone-67f564d7bf-cx47l\" (UID: \"e5a3f24e-8de2-49da-aa12-9559bf1c0212\") " pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.655358 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:04 crc kubenswrapper[4755]: I1006 08:39:04.973767 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dsb4x"] Oct 06 08:39:04 crc kubenswrapper[4755]: W1006 08:39:04.978836 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c6f5eb2_4ba0_4d5c_badd_a0ddb2da6f5c.slice/crio-75bbc4d02559b04f2b2dbd1efe18c002a9d7ad082eced851d632d7b89efb1eea WatchSource:0}: Error finding container 75bbc4d02559b04f2b2dbd1efe18c002a9d7ad082eced851d632d7b89efb1eea: Status 404 returned error can't find the container with id 75bbc4d02559b04f2b2dbd1efe18c002a9d7ad082eced851d632d7b89efb1eea Oct 06 08:39:05 crc kubenswrapper[4755]: I1006 08:39:05.071186 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7646c5cd7b-lvntf"] Oct 06 08:39:05 crc kubenswrapper[4755]: W1006 08:39:05.075526 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30667b83_ed3b_414b_af66_45b97ac252c1.slice/crio-9e005ee08059a311eb0417ce7bfa9f32ee7a16a74e77d7052e4d269572f6c4a7 WatchSource:0}: Error finding container 9e005ee08059a311eb0417ce7bfa9f32ee7a16a74e77d7052e4d269572f6c4a7: Status 404 returned error can't find the container with id 9e005ee08059a311eb0417ce7bfa9f32ee7a16a74e77d7052e4d269572f6c4a7 Oct 06 08:39:05 crc kubenswrapper[4755]: I1006 08:39:05.085899 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dsb4x" event={"ID":"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c","Type":"ContainerStarted","Data":"75bbc4d02559b04f2b2dbd1efe18c002a9d7ad082eced851d632d7b89efb1eea"} Oct 06 08:39:05 crc kubenswrapper[4755]: I1006 08:39:05.088327 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rxxsl" 
event={"ID":"b2d1c5df-48c1-4df7-9b04-c19e9510168a","Type":"ContainerStarted","Data":"d9a7deb04f821c6bd0ff5f1901592f8f21a22dcd3b6b6bbbfedd54dc74916100"} Oct 06 08:39:05 crc kubenswrapper[4755]: I1006 08:39:05.100642 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-rxxsl" podStartSLOduration=5.287810403 podStartE2EDuration="26.100622629s" podCreationTimestamp="2025-10-06 08:38:39 +0000 UTC" firstStartedPulling="2025-10-06 08:38:43.690552045 +0000 UTC m=+980.519867259" lastFinishedPulling="2025-10-06 08:39:04.503364271 +0000 UTC m=+1001.332679485" observedRunningTime="2025-10-06 08:39:05.099292046 +0000 UTC m=+1001.928607260" watchObservedRunningTime="2025-10-06 08:39:05.100622629 +0000 UTC m=+1001.929937863" Oct 06 08:39:05 crc kubenswrapper[4755]: I1006 08:39:05.167744 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-67f564d7bf-cx47l"] Oct 06 08:39:05 crc kubenswrapper[4755]: W1006 08:39:05.167780 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5a3f24e_8de2_49da_aa12_9559bf1c0212.slice/crio-f200170d3136c8e0f4de9acde0e5788b576159af6cf337d67240845a878e7d07 WatchSource:0}: Error finding container f200170d3136c8e0f4de9acde0e5788b576159af6cf337d67240845a878e7d07: Status 404 returned error can't find the container with id f200170d3136c8e0f4de9acde0e5788b576159af6cf337d67240845a878e7d07 Oct 06 08:39:05 crc kubenswrapper[4755]: I1006 08:39:05.906369 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" path="/var/lib/kubelet/pods/3f7c4eac-5ffd-4df6-8951-4fdc85e0076b/volumes" Oct 06 08:39:06 crc kubenswrapper[4755]: I1006 08:39:06.098997 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dsb4x" 
event={"ID":"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c","Type":"ContainerStarted","Data":"94c8ad26391b632e5f7b578218c32b8a57b58c1653ba4ed7c4c638eef2b92a24"} Oct 06 08:39:06 crc kubenswrapper[4755]: I1006 08:39:06.101286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67f564d7bf-cx47l" event={"ID":"e5a3f24e-8de2-49da-aa12-9559bf1c0212","Type":"ContainerStarted","Data":"0f22c61e0531bc45067bb4657ccc070fea0bfa994d311aec9769353609eee9f1"} Oct 06 08:39:06 crc kubenswrapper[4755]: I1006 08:39:06.101349 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67f564d7bf-cx47l" event={"ID":"e5a3f24e-8de2-49da-aa12-9559bf1c0212","Type":"ContainerStarted","Data":"f200170d3136c8e0f4de9acde0e5788b576159af6cf337d67240845a878e7d07"} Oct 06 08:39:06 crc kubenswrapper[4755]: I1006 08:39:06.101446 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:06 crc kubenswrapper[4755]: I1006 08:39:06.104037 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7646c5cd7b-lvntf" event={"ID":"30667b83-ed3b-414b-af66-45b97ac252c1","Type":"ContainerStarted","Data":"f18756c9dfcd4bb9357cb1477f8435e6b98777b4f10a30150122464a44deda53"} Oct 06 08:39:06 crc kubenswrapper[4755]: I1006 08:39:06.104083 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7646c5cd7b-lvntf" event={"ID":"30667b83-ed3b-414b-af66-45b97ac252c1","Type":"ContainerStarted","Data":"1fce900ebe9877a3eee0fb83a2cbd1439f9f5010996c08c762293b24bc01bf9f"} Oct 06 08:39:06 crc kubenswrapper[4755]: I1006 08:39:06.104095 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7646c5cd7b-lvntf" event={"ID":"30667b83-ed3b-414b-af66-45b97ac252c1","Type":"ContainerStarted","Data":"9e005ee08059a311eb0417ce7bfa9f32ee7a16a74e77d7052e4d269572f6c4a7"} Oct 06 08:39:06 crc kubenswrapper[4755]: I1006 08:39:06.128042 4755 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/neutron-db-sync-dsb4x" podStartSLOduration=12.128021345 podStartE2EDuration="12.128021345s" podCreationTimestamp="2025-10-06 08:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:06.114542494 +0000 UTC m=+1002.943857708" watchObservedRunningTime="2025-10-06 08:39:06.128021345 +0000 UTC m=+1002.957336559" Oct 06 08:39:06 crc kubenswrapper[4755]: I1006 08:39:06.158691 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7646c5cd7b-lvntf" podStartSLOduration=15.158664032 podStartE2EDuration="15.158664032s" podCreationTimestamp="2025-10-06 08:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:06.151520591 +0000 UTC m=+1002.980835805" watchObservedRunningTime="2025-10-06 08:39:06.158664032 +0000 UTC m=+1002.987979266" Oct 06 08:39:06 crc kubenswrapper[4755]: I1006 08:39:06.162252 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-67f564d7bf-cx47l" podStartSLOduration=2.162237893 podStartE2EDuration="2.162237893s" podCreationTimestamp="2025-10-06 08:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:06.135544307 +0000 UTC m=+1002.964859521" watchObservedRunningTime="2025-10-06 08:39:06.162237893 +0000 UTC m=+1002.991553107" Oct 06 08:39:07 crc kubenswrapper[4755]: I1006 08:39:07.111733 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:39:07 crc kubenswrapper[4755]: I1006 08:39:07.111954 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:39:07 crc kubenswrapper[4755]: I1006 
08:39:07.749808 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-bsqzt" podUID="3f7c4eac-5ffd-4df6-8951-4fdc85e0076b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: i/o timeout" Oct 06 08:39:09 crc kubenswrapper[4755]: I1006 08:39:09.128112 4755 generic.go:334] "Generic (PLEG): container finished" podID="b2d1c5df-48c1-4df7-9b04-c19e9510168a" containerID="d9a7deb04f821c6bd0ff5f1901592f8f21a22dcd3b6b6bbbfedd54dc74916100" exitCode=0 Oct 06 08:39:09 crc kubenswrapper[4755]: I1006 08:39:09.128208 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rxxsl" event={"ID":"b2d1c5df-48c1-4df7-9b04-c19e9510168a","Type":"ContainerDied","Data":"d9a7deb04f821c6bd0ff5f1901592f8f21a22dcd3b6b6bbbfedd54dc74916100"} Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.070249 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rxxsl" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.146401 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rxxsl" event={"ID":"b2d1c5df-48c1-4df7-9b04-c19e9510168a","Type":"ContainerDied","Data":"ccf0ac220bf41880bde6305152ff803a3162e0fa8e5da2908e53921a09262982"} Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.146439 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccf0ac220bf41880bde6305152ff803a3162e0fa8e5da2908e53921a09262982" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.146449 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rxxsl" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.150866 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d1c5df-48c1-4df7-9b04-c19e9510168a-combined-ca-bundle\") pod \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\" (UID: \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\") " Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.151002 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b2d1c5df-48c1-4df7-9b04-c19e9510168a-db-sync-config-data\") pod \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\" (UID: \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\") " Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.151085 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2hj9\" (UniqueName: \"kubernetes.io/projected/b2d1c5df-48c1-4df7-9b04-c19e9510168a-kube-api-access-n2hj9\") pod \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\" (UID: \"b2d1c5df-48c1-4df7-9b04-c19e9510168a\") " Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.160073 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2d1c5df-48c1-4df7-9b04-c19e9510168a-kube-api-access-n2hj9" (OuterVolumeSpecName: "kube-api-access-n2hj9") pod "b2d1c5df-48c1-4df7-9b04-c19e9510168a" (UID: "b2d1c5df-48c1-4df7-9b04-c19e9510168a"). InnerVolumeSpecName "kube-api-access-n2hj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.171770 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d1c5df-48c1-4df7-9b04-c19e9510168a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b2d1c5df-48c1-4df7-9b04-c19e9510168a" (UID: "b2d1c5df-48c1-4df7-9b04-c19e9510168a"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.181527 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d1c5df-48c1-4df7-9b04-c19e9510168a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2d1c5df-48c1-4df7-9b04-c19e9510168a" (UID: "b2d1c5df-48c1-4df7-9b04-c19e9510168a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.253527 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d1c5df-48c1-4df7-9b04-c19e9510168a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.253594 4755 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b2d1c5df-48c1-4df7-9b04-c19e9510168a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.253606 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2hj9\" (UniqueName: \"kubernetes.io/projected/b2d1c5df-48c1-4df7-9b04-c19e9510168a-kube-api-access-n2hj9\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.393383 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-c66599474-j7m6l"] Oct 06 08:39:11 crc kubenswrapper[4755]: E1006 08:39:11.397411 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2d1c5df-48c1-4df7-9b04-c19e9510168a" containerName="barbican-db-sync" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.397444 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2d1c5df-48c1-4df7-9b04-c19e9510168a" containerName="barbican-db-sync" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 
08:39:11.400810 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2d1c5df-48c1-4df7-9b04-c19e9510168a" containerName="barbican-db-sync" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.401982 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6589865cf-k6ldq"] Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.402107 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.406913 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.408823 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.411255 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.423851 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c66599474-j7m6l"] Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.438643 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6589865cf-k6ldq"] Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.505735 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-699df9757c-4qjn2"] Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.507132 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.519782 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-4qjn2"] Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.560270 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d155ddb9-1b21-4346-8858-2aba24321b8a-config-data\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.560332 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-combined-ca-bundle\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: \"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.560375 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-config-data-custom\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: \"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.560443 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d155ddb9-1b21-4346-8858-2aba24321b8a-logs\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.560723 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d155ddb9-1b21-4346-8858-2aba24321b8a-combined-ca-bundle\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.560786 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-logs\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: \"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.560836 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-config-data\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: \"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.560865 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjx2c\" (UniqueName: \"kubernetes.io/projected/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-kube-api-access-fjx2c\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: \"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.560961 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv7qn\" (UniqueName: \"kubernetes.io/projected/d155ddb9-1b21-4346-8858-2aba24321b8a-kube-api-access-lv7qn\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " 
pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.561011 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d155ddb9-1b21-4346-8858-2aba24321b8a-config-data-custom\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.601257 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-76b8d6c486-gk8d2"] Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.602989 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.607015 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.626700 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76b8d6c486-gk8d2"] Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.662859 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d155ddb9-1b21-4346-8858-2aba24321b8a-combined-ca-bundle\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.662910 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " 
pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.662937 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-logs\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: \"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.662975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-config-data\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: \"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.663010 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjx2c\" (UniqueName: \"kubernetes.io/projected/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-kube-api-access-fjx2c\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: \"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.663042 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-dns-svc\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.663076 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zchhv\" (UniqueName: \"kubernetes.io/projected/46024b0b-7959-469e-be3a-72570d94b1b2-kube-api-access-zchhv\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " 
pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.663108 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv7qn\" (UniqueName: \"kubernetes.io/projected/d155ddb9-1b21-4346-8858-2aba24321b8a-kube-api-access-lv7qn\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.663162 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d155ddb9-1b21-4346-8858-2aba24321b8a-config-data-custom\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.663187 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-config\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.663244 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d155ddb9-1b21-4346-8858-2aba24321b8a-config-data\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.663272 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-ovsdbserver-sb\") pod 
\"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.663299 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-combined-ca-bundle\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: \"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.663331 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-config-data-custom\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: \"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.663395 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d155ddb9-1b21-4346-8858-2aba24321b8a-logs\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.663912 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d155ddb9-1b21-4346-8858-2aba24321b8a-logs\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.664200 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-logs\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: 
\"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.667369 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d155ddb9-1b21-4346-8858-2aba24321b8a-combined-ca-bundle\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.669983 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-combined-ca-bundle\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: \"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.671214 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-config-data-custom\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: \"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.671336 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d155ddb9-1b21-4346-8858-2aba24321b8a-config-data\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.677821 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-config-data\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: 
\"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.680458 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv7qn\" (UniqueName: \"kubernetes.io/projected/d155ddb9-1b21-4346-8858-2aba24321b8a-kube-api-access-lv7qn\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.680609 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d155ddb9-1b21-4346-8858-2aba24321b8a-config-data-custom\") pod \"barbican-keystone-listener-c66599474-j7m6l\" (UID: \"d155ddb9-1b21-4346-8858-2aba24321b8a\") " pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.680788 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjx2c\" (UniqueName: \"kubernetes.io/projected/c90c3ed0-a9cd-4589-ba3a-4c77e2163190-kube-api-access-fjx2c\") pod \"barbican-worker-6589865cf-k6ldq\" (UID: \"c90c3ed0-a9cd-4589-ba3a-4c77e2163190\") " pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.730439 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-c66599474-j7m6l" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.750257 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6589865cf-k6ldq" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.764468 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zchhv\" (UniqueName: \"kubernetes.io/projected/46024b0b-7959-469e-be3a-72570d94b1b2-kube-api-access-zchhv\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.764542 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-config\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.764626 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv6dt\" (UniqueName: \"kubernetes.io/projected/1efe6eb8-9918-4061-ad60-7b22276e99a1-kube-api-access-hv6dt\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.764662 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.764697 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-config-data\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: 
\"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.764717 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1efe6eb8-9918-4061-ad60-7b22276e99a1-logs\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.764783 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-config-data-custom\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.764832 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.764880 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-combined-ca-bundle\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.764922 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-dns-svc\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: 
\"46024b0b-7959-469e-be3a-72570d94b1b2\") " pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.765418 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-config\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.765650 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.765810 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-dns-svc\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.766871 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.781797 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zchhv\" (UniqueName: \"kubernetes.io/projected/46024b0b-7959-469e-be3a-72570d94b1b2-kube-api-access-zchhv\") pod \"dnsmasq-dns-699df9757c-4qjn2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc 
kubenswrapper[4755]: I1006 08:39:11.834229 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.866969 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv6dt\" (UniqueName: \"kubernetes.io/projected/1efe6eb8-9918-4061-ad60-7b22276e99a1-kube-api-access-hv6dt\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.867030 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-config-data\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.867056 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1efe6eb8-9918-4061-ad60-7b22276e99a1-logs\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.867110 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-config-data-custom\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.867176 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-combined-ca-bundle\") pod 
\"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.867491 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1efe6eb8-9918-4061-ad60-7b22276e99a1-logs\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.870681 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-combined-ca-bundle\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.871295 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-config-data-custom\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.873735 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-config-data\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.891721 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv6dt\" (UniqueName: \"kubernetes.io/projected/1efe6eb8-9918-4061-ad60-7b22276e99a1-kube-api-access-hv6dt\") pod \"barbican-api-76b8d6c486-gk8d2\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " 
pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:11 crc kubenswrapper[4755]: I1006 08:39:11.944893 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:12 crc kubenswrapper[4755]: E1006 08:39:12.245313 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" Oct 06 08:39:12 crc kubenswrapper[4755]: I1006 08:39:12.392203 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c66599474-j7m6l"] Oct 06 08:39:12 crc kubenswrapper[4755]: W1006 08:39:12.401754 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd155ddb9_1b21_4346_8858_2aba24321b8a.slice/crio-8156f08ca9899b1d72ab1404861070e77bb4948608f2c1095d4b306233444968 WatchSource:0}: Error finding container 8156f08ca9899b1d72ab1404861070e77bb4948608f2c1095d4b306233444968: Status 404 returned error can't find the container with id 8156f08ca9899b1d72ab1404861070e77bb4948608f2c1095d4b306233444968 Oct 06 08:39:12 crc kubenswrapper[4755]: I1006 08:39:12.566219 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76b8d6c486-gk8d2"] Oct 06 08:39:12 crc kubenswrapper[4755]: W1006 08:39:12.584538 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc90c3ed0_a9cd_4589_ba3a_4c77e2163190.slice/crio-13edb6d605c0188fd3e0ba00d2bc03268eb0db308b9cc0a1b9c58512ebb50a4c WatchSource:0}: Error finding container 13edb6d605c0188fd3e0ba00d2bc03268eb0db308b9cc0a1b9c58512ebb50a4c: Status 404 returned error can't find the container with id 13edb6d605c0188fd3e0ba00d2bc03268eb0db308b9cc0a1b9c58512ebb50a4c Oct 06 
08:39:12 crc kubenswrapper[4755]: I1006 08:39:12.584577 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6589865cf-k6ldq"] Oct 06 08:39:12 crc kubenswrapper[4755]: I1006 08:39:12.591738 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-4qjn2"] Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.167382 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c66599474-j7m6l" event={"ID":"d155ddb9-1b21-4346-8858-2aba24321b8a","Type":"ContainerStarted","Data":"8156f08ca9899b1d72ab1404861070e77bb4948608f2c1095d4b306233444968"} Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.170384 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1b65447-5db8-480f-a0d5-17a674f2c401","Type":"ContainerStarted","Data":"f3263c3376e04064185f63ccbf34beb60c7091c2035a7532143b2a6b11b61d54"} Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.170495 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerName="ceilometer-central-agent" containerID="cri-o://24ffdb24a4b1f45c324ee02d1996f1676002f63c177e50749e001f7366cd2275" gracePeriod=30 Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.170521 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.170512 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerName="proxy-httpd" containerID="cri-o://f3263c3376e04064185f63ccbf34beb60c7091c2035a7532143b2a6b11b61d54" gracePeriod=30 Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.170554 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerName="ceilometer-notification-agent" containerID="cri-o://00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce" gracePeriod=30 Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.174744 4755 generic.go:334] "Generic (PLEG): container finished" podID="46024b0b-7959-469e-be3a-72570d94b1b2" containerID="fce724e0a30e9c89d584227dcc8aa4096438bbeb0999bdf17e4abf4093ce120f" exitCode=0 Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.174827 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-4qjn2" event={"ID":"46024b0b-7959-469e-be3a-72570d94b1b2","Type":"ContainerDied","Data":"fce724e0a30e9c89d584227dcc8aa4096438bbeb0999bdf17e4abf4093ce120f"} Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.174854 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-4qjn2" event={"ID":"46024b0b-7959-469e-be3a-72570d94b1b2","Type":"ContainerStarted","Data":"036afbb03ff4b1357a5e9c2ca57548ff975fcbe75fd7c8106d2bed3b1226e45c"} Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.189786 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76b8d6c486-gk8d2" event={"ID":"1efe6eb8-9918-4061-ad60-7b22276e99a1","Type":"ContainerStarted","Data":"2d33c949525d8849955afacf4b3177e5a7621df526948c93d6ceb699b6f81af4"} Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.189862 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76b8d6c486-gk8d2" event={"ID":"1efe6eb8-9918-4061-ad60-7b22276e99a1","Type":"ContainerStarted","Data":"bb8eac9f031b4bdef5a53f7ce84a12e1231f032e94ed0e1988f57f55ff61e962"} Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.189877 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76b8d6c486-gk8d2" 
event={"ID":"1efe6eb8-9918-4061-ad60-7b22276e99a1","Type":"ContainerStarted","Data":"1c5e44755e6fb9f5f0a66b616f3fb63dc30c09c1e18eb5924b024b93f83a35c5"} Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.190958 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.191046 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.210332 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6589865cf-k6ldq" event={"ID":"c90c3ed0-a9cd-4589-ba3a-4c77e2163190","Type":"ContainerStarted","Data":"13edb6d605c0188fd3e0ba00d2bc03268eb0db308b9cc0a1b9c58512ebb50a4c"} Oct 06 08:39:13 crc kubenswrapper[4755]: I1006 08:39:13.278376 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-76b8d6c486-gk8d2" podStartSLOduration=2.278352529 podStartE2EDuration="2.278352529s" podCreationTimestamp="2025-10-06 08:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:13.246145732 +0000 UTC m=+1010.075460956" watchObservedRunningTime="2025-10-06 08:39:13.278352529 +0000 UTC m=+1010.107667743" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.221898 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-4qjn2" event={"ID":"46024b0b-7959-469e-be3a-72570d94b1b2","Type":"ContainerStarted","Data":"9a582759371a9ac69d3e8fb77e85d135abaab22ab0ec19e4764bf1ee9f607e2a"} Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.222468 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.228613 4755 generic.go:334] "Generic (PLEG): container 
finished" podID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerID="f3263c3376e04064185f63ccbf34beb60c7091c2035a7532143b2a6b11b61d54" exitCode=0 Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.228647 4755 generic.go:334] "Generic (PLEG): container finished" podID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerID="24ffdb24a4b1f45c324ee02d1996f1676002f63c177e50749e001f7366cd2275" exitCode=0 Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.229902 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1b65447-5db8-480f-a0d5-17a674f2c401","Type":"ContainerDied","Data":"f3263c3376e04064185f63ccbf34beb60c7091c2035a7532143b2a6b11b61d54"} Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.229930 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1b65447-5db8-480f-a0d5-17a674f2c401","Type":"ContainerDied","Data":"24ffdb24a4b1f45c324ee02d1996f1676002f63c177e50749e001f7366cd2275"} Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.263195 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-699df9757c-4qjn2" podStartSLOduration=3.263168136 podStartE2EDuration="3.263168136s" podCreationTimestamp="2025-10-06 08:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:14.255684786 +0000 UTC m=+1011.084999990" watchObservedRunningTime="2025-10-06 08:39:14.263168136 +0000 UTC m=+1011.092483350" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.361471 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7567ddf88-bwsd4"] Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.363587 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.370552 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.375537 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7567ddf88-bwsd4"] Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.380173 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.532734 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-internal-tls-certs\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.532869 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hn9w\" (UniqueName: \"kubernetes.io/projected/cd538b73-7f94-452c-a366-692369e490da-kube-api-access-4hn9w\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.532921 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-public-tls-certs\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.532944 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-combined-ca-bundle\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.533030 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-config-data\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.533065 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd538b73-7f94-452c-a366-692369e490da-logs\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.533130 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-config-data-custom\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.634239 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hn9w\" (UniqueName: \"kubernetes.io/projected/cd538b73-7f94-452c-a366-692369e490da-kube-api-access-4hn9w\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.634321 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-public-tls-certs\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.634355 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-combined-ca-bundle\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.634387 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-config-data\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.634404 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd538b73-7f94-452c-a366-692369e490da-logs\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.634447 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-config-data-custom\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.634477 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-internal-tls-certs\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.635466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd538b73-7f94-452c-a366-692369e490da-logs\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.638951 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-internal-tls-certs\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.639604 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-combined-ca-bundle\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.640214 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-public-tls-certs\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.645206 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-config-data-custom\") pod 
\"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.645461 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd538b73-7f94-452c-a366-692369e490da-config-data\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.651868 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hn9w\" (UniqueName: \"kubernetes.io/projected/cd538b73-7f94-452c-a366-692369e490da-kube-api-access-4hn9w\") pod \"barbican-api-7567ddf88-bwsd4\" (UID: \"cd538b73-7f94-452c-a366-692369e490da\") " pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:14 crc kubenswrapper[4755]: I1006 08:39:14.812716 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:15 crc kubenswrapper[4755]: I1006 08:39:15.238287 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c66599474-j7m6l" event={"ID":"d155ddb9-1b21-4346-8858-2aba24321b8a","Type":"ContainerStarted","Data":"0e1e91bb39e9b535bf51beccca0768799218aa46d26a4e99bf281e42cc04681c"} Oct 06 08:39:15 crc kubenswrapper[4755]: I1006 08:39:15.238685 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c66599474-j7m6l" event={"ID":"d155ddb9-1b21-4346-8858-2aba24321b8a","Type":"ContainerStarted","Data":"5acf889f2076c4e766f81500232d4bff250cbcea21a92fea96aa4946fbce2345"} Oct 06 08:39:15 crc kubenswrapper[4755]: I1006 08:39:15.240614 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6589865cf-k6ldq" event={"ID":"c90c3ed0-a9cd-4589-ba3a-4c77e2163190","Type":"ContainerStarted","Data":"b587436746efa11cd6117e52dab8df5d17780e8930ae7e1530a0861c066fafa7"} Oct 06 08:39:15 crc kubenswrapper[4755]: I1006 08:39:15.240688 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6589865cf-k6ldq" event={"ID":"c90c3ed0-a9cd-4589-ba3a-4c77e2163190","Type":"ContainerStarted","Data":"729964477f74bbe2f7d33e2234c0e92acd1b59ffbc9a9143fd3c489029bc9a04"} Oct 06 08:39:15 crc kubenswrapper[4755]: I1006 08:39:15.253866 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7567ddf88-bwsd4"] Oct 06 08:39:15 crc kubenswrapper[4755]: I1006 08:39:15.277637 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-c66599474-j7m6l" podStartSLOduration=2.48775314 podStartE2EDuration="4.277587763s" podCreationTimestamp="2025-10-06 08:39:11 +0000 UTC" firstStartedPulling="2025-10-06 08:39:12.403938183 +0000 UTC m=+1009.233253397" lastFinishedPulling="2025-10-06 08:39:14.193772806 +0000 UTC 
m=+1011.023088020" observedRunningTime="2025-10-06 08:39:15.276768213 +0000 UTC m=+1012.106083427" watchObservedRunningTime="2025-10-06 08:39:15.277587763 +0000 UTC m=+1012.106902997" Oct 06 08:39:15 crc kubenswrapper[4755]: I1006 08:39:15.306514 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6589865cf-k6ldq" podStartSLOduration=2.719116629 podStartE2EDuration="4.306479686s" podCreationTimestamp="2025-10-06 08:39:11 +0000 UTC" firstStartedPulling="2025-10-06 08:39:12.604085369 +0000 UTC m=+1009.433400583" lastFinishedPulling="2025-10-06 08:39:14.191448426 +0000 UTC m=+1011.020763640" observedRunningTime="2025-10-06 08:39:15.29521486 +0000 UTC m=+1012.124530074" watchObservedRunningTime="2025-10-06 08:39:15.306479686 +0000 UTC m=+1012.135794920" Oct 06 08:39:16 crc kubenswrapper[4755]: I1006 08:39:16.248948 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7567ddf88-bwsd4" event={"ID":"cd538b73-7f94-452c-a366-692369e490da","Type":"ContainerStarted","Data":"3123197f84feaf62bc7fcd635d620e9fc8c5f068fd26a794a7a26f812905e480"} Oct 06 08:39:16 crc kubenswrapper[4755]: I1006 08:39:16.249214 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7567ddf88-bwsd4" event={"ID":"cd538b73-7f94-452c-a366-692369e490da","Type":"ContainerStarted","Data":"015f18690d8ef21cc4ffedeeaeea15627068f07a0c3603ee29c82cc7b636421a"} Oct 06 08:39:16 crc kubenswrapper[4755]: I1006 08:39:16.249226 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7567ddf88-bwsd4" event={"ID":"cd538b73-7f94-452c-a366-692369e490da","Type":"ContainerStarted","Data":"10b2d039db4b0e0053c40f0f56a8a20cb7091d491270bc258300031e955ae2b4"} Oct 06 08:39:16 crc kubenswrapper[4755]: I1006 08:39:16.273633 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7567ddf88-bwsd4" podStartSLOduration=2.273613304 
podStartE2EDuration="2.273613304s" podCreationTimestamp="2025-10-06 08:39:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:16.267729434 +0000 UTC m=+1013.097044658" watchObservedRunningTime="2025-10-06 08:39:16.273613304 +0000 UTC m=+1013.102928518" Oct 06 08:39:17 crc kubenswrapper[4755]: I1006 08:39:17.258474 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:17 crc kubenswrapper[4755]: I1006 08:39:17.258845 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:18 crc kubenswrapper[4755]: E1006 08:39:18.723800 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1b65447_5db8_480f_a0d5_17a674f2c401.slice/crio-00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1b65447_5db8_480f_a0d5_17a674f2c401.slice/crio-conmon-00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce.scope\": RecentStats: unable to find data in memory cache]" Oct 06 08:39:18 crc kubenswrapper[4755]: I1006 08:39:18.912853 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:39:18 crc kubenswrapper[4755]: I1006 08:39:18.912924 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.081106 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.225070 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1b65447-5db8-480f-a0d5-17a674f2c401-run-httpd\") pod \"b1b65447-5db8-480f-a0d5-17a674f2c401\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.225149 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2rg9\" (UniqueName: \"kubernetes.io/projected/b1b65447-5db8-480f-a0d5-17a674f2c401-kube-api-access-v2rg9\") pod \"b1b65447-5db8-480f-a0d5-17a674f2c401\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.225216 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-sg-core-conf-yaml\") pod \"b1b65447-5db8-480f-a0d5-17a674f2c401\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.225270 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-combined-ca-bundle\") pod \"b1b65447-5db8-480f-a0d5-17a674f2c401\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.225312 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-scripts\") pod 
\"b1b65447-5db8-480f-a0d5-17a674f2c401\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.225375 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1b65447-5db8-480f-a0d5-17a674f2c401-log-httpd\") pod \"b1b65447-5db8-480f-a0d5-17a674f2c401\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.225412 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-config-data\") pod \"b1b65447-5db8-480f-a0d5-17a674f2c401\" (UID: \"b1b65447-5db8-480f-a0d5-17a674f2c401\") " Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.228111 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b65447-5db8-480f-a0d5-17a674f2c401-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b1b65447-5db8-480f-a0d5-17a674f2c401" (UID: "b1b65447-5db8-480f-a0d5-17a674f2c401"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.228639 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b65447-5db8-480f-a0d5-17a674f2c401-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b1b65447-5db8-480f-a0d5-17a674f2c401" (UID: "b1b65447-5db8-480f-a0d5-17a674f2c401"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.233282 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b65447-5db8-480f-a0d5-17a674f2c401-kube-api-access-v2rg9" (OuterVolumeSpecName: "kube-api-access-v2rg9") pod "b1b65447-5db8-480f-a0d5-17a674f2c401" (UID: "b1b65447-5db8-480f-a0d5-17a674f2c401"). 
InnerVolumeSpecName "kube-api-access-v2rg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.235642 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b1b65447-5db8-480f-a0d5-17a674f2c401" (UID: "b1b65447-5db8-480f-a0d5-17a674f2c401"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.235819 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-scripts" (OuterVolumeSpecName: "scripts") pod "b1b65447-5db8-480f-a0d5-17a674f2c401" (UID: "b1b65447-5db8-480f-a0d5-17a674f2c401"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.299271 4755 generic.go:334] "Generic (PLEG): container finished" podID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerID="00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce" exitCode=0 Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.299323 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1b65447-5db8-480f-a0d5-17a674f2c401","Type":"ContainerDied","Data":"00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce"} Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.299361 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1b65447-5db8-480f-a0d5-17a674f2c401","Type":"ContainerDied","Data":"a0399d7de5c9d99b56aab33dd0843ba233ec2c1514e39c1986498491dabb5107"} Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.299385 4755 scope.go:117] "RemoveContainer" containerID="f3263c3376e04064185f63ccbf34beb60c7091c2035a7532143b2a6b11b61d54" Oct 06 
08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.299393 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.318484 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1b65447-5db8-480f-a0d5-17a674f2c401" (UID: "b1b65447-5db8-480f-a0d5-17a674f2c401"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.330727 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1b65447-5db8-480f-a0d5-17a674f2c401-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.330764 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1b65447-5db8-480f-a0d5-17a674f2c401-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.330774 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2rg9\" (UniqueName: \"kubernetes.io/projected/b1b65447-5db8-480f-a0d5-17a674f2c401-kube-api-access-v2rg9\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.330785 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.330794 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:19 crc kubenswrapper[4755]: 
I1006 08:39:19.330802 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.331798 4755 scope.go:117] "RemoveContainer" containerID="00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.352167 4755 scope.go:117] "RemoveContainer" containerID="24ffdb24a4b1f45c324ee02d1996f1676002f63c177e50749e001f7366cd2275" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.359256 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-config-data" (OuterVolumeSpecName: "config-data") pod "b1b65447-5db8-480f-a0d5-17a674f2c401" (UID: "b1b65447-5db8-480f-a0d5-17a674f2c401"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.379267 4755 scope.go:117] "RemoveContainer" containerID="f3263c3376e04064185f63ccbf34beb60c7091c2035a7532143b2a6b11b61d54" Oct 06 08:39:19 crc kubenswrapper[4755]: E1006 08:39:19.379957 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3263c3376e04064185f63ccbf34beb60c7091c2035a7532143b2a6b11b61d54\": container with ID starting with f3263c3376e04064185f63ccbf34beb60c7091c2035a7532143b2a6b11b61d54 not found: ID does not exist" containerID="f3263c3376e04064185f63ccbf34beb60c7091c2035a7532143b2a6b11b61d54" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.380005 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3263c3376e04064185f63ccbf34beb60c7091c2035a7532143b2a6b11b61d54"} err="failed to get container status \"f3263c3376e04064185f63ccbf34beb60c7091c2035a7532143b2a6b11b61d54\": rpc error: code = 
NotFound desc = could not find container \"f3263c3376e04064185f63ccbf34beb60c7091c2035a7532143b2a6b11b61d54\": container with ID starting with f3263c3376e04064185f63ccbf34beb60c7091c2035a7532143b2a6b11b61d54 not found: ID does not exist" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.380031 4755 scope.go:117] "RemoveContainer" containerID="00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce" Oct 06 08:39:19 crc kubenswrapper[4755]: E1006 08:39:19.380356 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce\": container with ID starting with 00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce not found: ID does not exist" containerID="00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.380404 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce"} err="failed to get container status \"00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce\": rpc error: code = NotFound desc = could not find container \"00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce\": container with ID starting with 00f9151698e1324421ae57b7b8c2ef94b9e1ec64034b512038f2a27315e0bfce not found: ID does not exist" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.380438 4755 scope.go:117] "RemoveContainer" containerID="24ffdb24a4b1f45c324ee02d1996f1676002f63c177e50749e001f7366cd2275" Oct 06 08:39:19 crc kubenswrapper[4755]: E1006 08:39:19.380804 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24ffdb24a4b1f45c324ee02d1996f1676002f63c177e50749e001f7366cd2275\": container with ID starting with 
24ffdb24a4b1f45c324ee02d1996f1676002f63c177e50749e001f7366cd2275 not found: ID does not exist" containerID="24ffdb24a4b1f45c324ee02d1996f1676002f63c177e50749e001f7366cd2275" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.380837 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24ffdb24a4b1f45c324ee02d1996f1676002f63c177e50749e001f7366cd2275"} err="failed to get container status \"24ffdb24a4b1f45c324ee02d1996f1676002f63c177e50749e001f7366cd2275\": rpc error: code = NotFound desc = could not find container \"24ffdb24a4b1f45c324ee02d1996f1676002f63c177e50749e001f7366cd2275\": container with ID starting with 24ffdb24a4b1f45c324ee02d1996f1676002f63c177e50749e001f7366cd2275 not found: ID does not exist" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.432892 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b65447-5db8-480f-a0d5-17a674f2c401-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.654435 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.662816 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.685243 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:19 crc kubenswrapper[4755]: E1006 08:39:19.685679 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerName="ceilometer-notification-agent" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.685695 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerName="ceilometer-notification-agent" Oct 06 08:39:19 crc kubenswrapper[4755]: E1006 08:39:19.685707 4755 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerName="proxy-httpd" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.685713 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerName="proxy-httpd" Oct 06 08:39:19 crc kubenswrapper[4755]: E1006 08:39:19.685730 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerName="ceilometer-central-agent" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.685737 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerName="ceilometer-central-agent" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.685945 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerName="proxy-httpd" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.685986 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerName="ceilometer-central-agent" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.686009 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" containerName="ceilometer-notification-agent" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.687744 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.689558 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.690472 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.696922 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.839631 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cde5598c-3b31-4691-b149-7602575c7ff4-run-httpd\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.839677 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cde5598c-3b31-4691-b149-7602575c7ff4-log-httpd\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.839697 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-scripts\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.839761 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-config-data\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 
08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.839782 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.839863 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.839905 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqm9x\" (UniqueName: \"kubernetes.io/projected/cde5598c-3b31-4691-b149-7602575c7ff4-kube-api-access-gqm9x\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.901613 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b65447-5db8-480f-a0d5-17a674f2c401" path="/var/lib/kubelet/pods/b1b65447-5db8-480f-a0d5-17a674f2c401/volumes" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.942371 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.942879 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqm9x\" (UniqueName: 
\"kubernetes.io/projected/cde5598c-3b31-4691-b149-7602575c7ff4-kube-api-access-gqm9x\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.943517 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cde5598c-3b31-4691-b149-7602575c7ff4-run-httpd\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.943676 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cde5598c-3b31-4691-b149-7602575c7ff4-log-httpd\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.943718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-scripts\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.943806 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-config-data\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.943858 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.943949 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cde5598c-3b31-4691-b149-7602575c7ff4-run-httpd\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.944096 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cde5598c-3b31-4691-b149-7602575c7ff4-log-httpd\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.947568 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.948103 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-scripts\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.948298 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.948611 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-config-data\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 
08:39:19 crc kubenswrapper[4755]: I1006 08:39:19.959727 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqm9x\" (UniqueName: \"kubernetes.io/projected/cde5598c-3b31-4691-b149-7602575c7ff4-kube-api-access-gqm9x\") pod \"ceilometer-0\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " pod="openstack/ceilometer-0" Oct 06 08:39:20 crc kubenswrapper[4755]: I1006 08:39:20.005351 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:20 crc kubenswrapper[4755]: I1006 08:39:20.521406 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:21 crc kubenswrapper[4755]: I1006 08:39:21.324097 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-c6wg6" event={"ID":"9755bfc9-d53e-4848-8d4b-04fdef46a4ea","Type":"ContainerStarted","Data":"d93a136dc69d666c5a1b24a5ce09b5f163fa7c3a458a70322d40a2a2ece5d440"} Oct 06 08:39:21 crc kubenswrapper[4755]: I1006 08:39:21.327822 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cde5598c-3b31-4691-b149-7602575c7ff4","Type":"ContainerStarted","Data":"71aa47d747df5be3c76a087610bf4b107811675b078b57a020ce0ed5ef56fd55"} Oct 06 08:39:21 crc kubenswrapper[4755]: I1006 08:39:21.348511 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-c6wg6" podStartSLOduration=6.061578686 podStartE2EDuration="42.348491109s" podCreationTimestamp="2025-10-06 08:38:39 +0000 UTC" firstStartedPulling="2025-10-06 08:38:43.625974488 +0000 UTC m=+980.455289702" lastFinishedPulling="2025-10-06 08:39:19.912886901 +0000 UTC m=+1016.742202125" observedRunningTime="2025-10-06 08:39:21.344707753 +0000 UTC m=+1018.174022987" watchObservedRunningTime="2025-10-06 08:39:21.348491109 +0000 UTC m=+1018.177806323" Oct 06 08:39:21 crc kubenswrapper[4755]: I1006 08:39:21.837048 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:21 crc kubenswrapper[4755]: I1006 08:39:21.913452 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-lxx5f"] Oct 06 08:39:21 crc kubenswrapper[4755]: I1006 08:39:21.913824 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f" podUID="ffad361d-03f7-4ed8-938c-013349c3eab0" containerName="dnsmasq-dns" containerID="cri-o://7d3fc1e742754cda9a279a87c4ac907df4f637805d58ce84f78e865585e478ae" gracePeriod=10 Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.422399 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cde5598c-3b31-4691-b149-7602575c7ff4","Type":"ContainerStarted","Data":"fc2e6db0b882631f699b716510c49aaf912424f1fd6b48f48d7e016bafc0dc49"} Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.427164 4755 generic.go:334] "Generic (PLEG): container finished" podID="ffad361d-03f7-4ed8-938c-013349c3eab0" containerID="7d3fc1e742754cda9a279a87c4ac907df4f637805d58ce84f78e865585e478ae" exitCode=0 Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.427212 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f" event={"ID":"ffad361d-03f7-4ed8-938c-013349c3eab0","Type":"ContainerDied","Data":"7d3fc1e742754cda9a279a87c4ac907df4f637805d58ce84f78e865585e478ae"} Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.571860 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f" Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.712271 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-config\") pod \"ffad361d-03f7-4ed8-938c-013349c3eab0\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.712383 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-dns-svc\") pod \"ffad361d-03f7-4ed8-938c-013349c3eab0\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.712415 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-ovsdbserver-nb\") pod \"ffad361d-03f7-4ed8-938c-013349c3eab0\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.712456 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-ovsdbserver-sb\") pod \"ffad361d-03f7-4ed8-938c-013349c3eab0\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.712548 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppfr4\" (UniqueName: \"kubernetes.io/projected/ffad361d-03f7-4ed8-938c-013349c3eab0-kube-api-access-ppfr4\") pod \"ffad361d-03f7-4ed8-938c-013349c3eab0\" (UID: \"ffad361d-03f7-4ed8-938c-013349c3eab0\") " Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.724753 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ffad361d-03f7-4ed8-938c-013349c3eab0-kube-api-access-ppfr4" (OuterVolumeSpecName: "kube-api-access-ppfr4") pod "ffad361d-03f7-4ed8-938c-013349c3eab0" (UID: "ffad361d-03f7-4ed8-938c-013349c3eab0"). InnerVolumeSpecName "kube-api-access-ppfr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.792704 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ffad361d-03f7-4ed8-938c-013349c3eab0" (UID: "ffad361d-03f7-4ed8-938c-013349c3eab0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.798194 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ffad361d-03f7-4ed8-938c-013349c3eab0" (UID: "ffad361d-03f7-4ed8-938c-013349c3eab0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.815677 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppfr4\" (UniqueName: \"kubernetes.io/projected/ffad361d-03f7-4ed8-938c-013349c3eab0-kube-api-access-ppfr4\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.815719 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.815729 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.817312 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-config" (OuterVolumeSpecName: "config") pod "ffad361d-03f7-4ed8-938c-013349c3eab0" (UID: "ffad361d-03f7-4ed8-938c-013349c3eab0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.826072 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ffad361d-03f7-4ed8-938c-013349c3eab0" (UID: "ffad361d-03f7-4ed8-938c-013349c3eab0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.917266 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:22 crc kubenswrapper[4755]: I1006 08:39:22.917304 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffad361d-03f7-4ed8-938c-013349c3eab0-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:23 crc kubenswrapper[4755]: I1006 08:39:23.444668 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f" Oct 06 08:39:23 crc kubenswrapper[4755]: I1006 08:39:23.444925 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-lxx5f" event={"ID":"ffad361d-03f7-4ed8-938c-013349c3eab0","Type":"ContainerDied","Data":"c72cf6323afcab861a193f63e66f7dec0a67e746a4516335473424385c10c219"} Oct 06 08:39:23 crc kubenswrapper[4755]: I1006 08:39:23.445086 4755 scope.go:117] "RemoveContainer" containerID="7d3fc1e742754cda9a279a87c4ac907df4f637805d58ce84f78e865585e478ae" Oct 06 08:39:23 crc kubenswrapper[4755]: I1006 08:39:23.449524 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cde5598c-3b31-4691-b149-7602575c7ff4","Type":"ContainerStarted","Data":"70ec16889805bc93454292502c46a412ccadc7f10c1bd7d5b847d89b2fdbe2d7"} Oct 06 08:39:23 crc kubenswrapper[4755]: I1006 08:39:23.449596 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cde5598c-3b31-4691-b149-7602575c7ff4","Type":"ContainerStarted","Data":"4c9250252c256b9d9196af452474e6d6e7ba49aebeb0278335a701581e21b235"} Oct 06 08:39:23 crc kubenswrapper[4755]: I1006 08:39:23.469564 4755 scope.go:117] "RemoveContainer" 
containerID="2cf34babb62f405eaf9271a4ca4b5c9a82266efbd9c98fdb3b4e828e84917f64" Oct 06 08:39:23 crc kubenswrapper[4755]: I1006 08:39:23.487725 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-lxx5f"] Oct 06 08:39:23 crc kubenswrapper[4755]: I1006 08:39:23.499399 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-lxx5f"] Oct 06 08:39:23 crc kubenswrapper[4755]: I1006 08:39:23.609543 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:39:23 crc kubenswrapper[4755]: I1006 08:39:23.684964 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:23 crc kubenswrapper[4755]: I1006 08:39:23.685250 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7646c5cd7b-lvntf" Oct 06 08:39:23 crc kubenswrapper[4755]: I1006 08:39:23.820705 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:23 crc kubenswrapper[4755]: I1006 08:39:23.960133 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffad361d-03f7-4ed8-938c-013349c3eab0" path="/var/lib/kubelet/pods/ffad361d-03f7-4ed8-938c-013349c3eab0/volumes" Oct 06 08:39:25 crc kubenswrapper[4755]: I1006 08:39:25.471864 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cde5598c-3b31-4691-b149-7602575c7ff4","Type":"ContainerStarted","Data":"8db20209351aed54da6776857bb55406d69418eb44a16142e5d48fd69a8a29a4"} Oct 06 08:39:25 crc kubenswrapper[4755]: I1006 08:39:25.472853 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:39:25 crc kubenswrapper[4755]: I1006 08:39:25.474132 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c" containerID="94c8ad26391b632e5f7b578218c32b8a57b58c1653ba4ed7c4c638eef2b92a24" exitCode=0 Oct 06 08:39:25 crc kubenswrapper[4755]: I1006 08:39:25.474205 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dsb4x" event={"ID":"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c","Type":"ContainerDied","Data":"94c8ad26391b632e5f7b578218c32b8a57b58c1653ba4ed7c4c638eef2b92a24"} Oct 06 08:39:25 crc kubenswrapper[4755]: I1006 08:39:25.502074 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.494208135 podStartE2EDuration="6.502017009s" podCreationTimestamp="2025-10-06 08:39:19 +0000 UTC" firstStartedPulling="2025-10-06 08:39:20.514608541 +0000 UTC m=+1017.343923755" lastFinishedPulling="2025-10-06 08:39:24.522417415 +0000 UTC m=+1021.351732629" observedRunningTime="2025-10-06 08:39:25.494136109 +0000 UTC m=+1022.323451323" watchObservedRunningTime="2025-10-06 08:39:25.502017009 +0000 UTC m=+1022.331332223" Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 08:39:26.306025 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 08:39:26.311894 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7567ddf88-bwsd4" Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 08:39:26.439630 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76b8d6c486-gk8d2"] Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 08:39:26.440171 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76b8d6c486-gk8d2" podUID="1efe6eb8-9918-4061-ad60-7b22276e99a1" containerName="barbican-api" containerID="cri-o://2d33c949525d8849955afacf4b3177e5a7621df526948c93d6ceb699b6f81af4" gracePeriod=30 Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 
08:39:26.439939 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76b8d6c486-gk8d2" podUID="1efe6eb8-9918-4061-ad60-7b22276e99a1" containerName="barbican-api-log" containerID="cri-o://bb8eac9f031b4bdef5a53f7ce84a12e1231f032e94ed0e1988f57f55ff61e962" gracePeriod=30 Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 08:39:26.483819 4755 generic.go:334] "Generic (PLEG): container finished" podID="9755bfc9-d53e-4848-8d4b-04fdef46a4ea" containerID="d93a136dc69d666c5a1b24a5ce09b5f163fa7c3a458a70322d40a2a2ece5d440" exitCode=0 Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 08:39:26.484013 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-c6wg6" event={"ID":"9755bfc9-d53e-4848-8d4b-04fdef46a4ea","Type":"ContainerDied","Data":"d93a136dc69d666c5a1b24a5ce09b5f163fa7c3a458a70322d40a2a2ece5d440"} Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 08:39:26.857074 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dsb4x" Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 08:39:26.905423 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-combined-ca-bundle\") pod \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\" (UID: \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\") " Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 08:39:26.905536 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg86g\" (UniqueName: \"kubernetes.io/projected/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-kube-api-access-cg86g\") pod \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\" (UID: \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\") " Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 08:39:26.905612 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-config\") pod \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\" (UID: \"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c\") " Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 08:39:26.912249 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-kube-api-access-cg86g" (OuterVolumeSpecName: "kube-api-access-cg86g") pod "8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c" (UID: "8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c"). InnerVolumeSpecName "kube-api-access-cg86g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 08:39:26.940042 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-config" (OuterVolumeSpecName: "config") pod "8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c" (UID: "8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:26 crc kubenswrapper[4755]: I1006 08:39:26.953833 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c" (UID: "8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.013868 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.013922 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg86g\" (UniqueName: \"kubernetes.io/projected/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-kube-api-access-cg86g\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.013936 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.492521 4755 generic.go:334] "Generic (PLEG): container finished" podID="1efe6eb8-9918-4061-ad60-7b22276e99a1" containerID="bb8eac9f031b4bdef5a53f7ce84a12e1231f032e94ed0e1988f57f55ff61e962" exitCode=143 Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.492602 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76b8d6c486-gk8d2" event={"ID":"1efe6eb8-9918-4061-ad60-7b22276e99a1","Type":"ContainerDied","Data":"bb8eac9f031b4bdef5a53f7ce84a12e1231f032e94ed0e1988f57f55ff61e962"} Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.495471 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dsb4x" event={"ID":"8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c","Type":"ContainerDied","Data":"75bbc4d02559b04f2b2dbd1efe18c002a9d7ad082eced851d632d7b89efb1eea"} Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.495508 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-dsb4x" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.495511 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75bbc4d02559b04f2b2dbd1efe18c002a9d7ad082eced851d632d7b89efb1eea" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.772427 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-k4cdv"] Oct 06 08:39:27 crc kubenswrapper[4755]: E1006 08:39:27.774901 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c" containerName="neutron-db-sync" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.774926 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c" containerName="neutron-db-sync" Oct 06 08:39:27 crc kubenswrapper[4755]: E1006 08:39:27.774947 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffad361d-03f7-4ed8-938c-013349c3eab0" containerName="init" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.774956 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffad361d-03f7-4ed8-938c-013349c3eab0" containerName="init" Oct 06 08:39:27 crc kubenswrapper[4755]: E1006 08:39:27.775065 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffad361d-03f7-4ed8-938c-013349c3eab0" containerName="dnsmasq-dns" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.775079 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffad361d-03f7-4ed8-938c-013349c3eab0" containerName="dnsmasq-dns" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.775311 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffad361d-03f7-4ed8-938c-013349c3eab0" containerName="dnsmasq-dns" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.775327 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c" 
containerName="neutron-db-sync" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.777365 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.808993 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-k4cdv"] Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.836664 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvm5m\" (UniqueName: \"kubernetes.io/projected/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-kube-api-access-nvm5m\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.836712 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-dns-svc\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.836807 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-config\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.836831 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 
06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.836863 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.914323 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.930638 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-74bd7fb97b-tzfvn"] Oct 06 08:39:27 crc kubenswrapper[4755]: E1006 08:39:27.931170 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9755bfc9-d53e-4848-8d4b-04fdef46a4ea" containerName="cinder-db-sync" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.931194 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9755bfc9-d53e-4848-8d4b-04fdef46a4ea" containerName="cinder-db-sync" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.931459 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9755bfc9-d53e-4848-8d4b-04fdef46a4ea" containerName="cinder-db-sync" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.932636 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.936155 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.936449 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-c9flv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.936723 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.936936 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.937066 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-74bd7fb97b-tzfvn"] Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.937711 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-config-data\") pod \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.937745 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-scripts\") pod \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.937774 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-etc-machine-id\") pod \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.937926 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-combined-ca-bundle\") pod \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.937986 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flrfk\" (UniqueName: \"kubernetes.io/projected/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-kube-api-access-flrfk\") pod \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.938015 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-db-sync-config-data\") pod \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\" (UID: \"9755bfc9-d53e-4848-8d4b-04fdef46a4ea\") " Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.938304 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.938378 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvm5m\" (UniqueName: \"kubernetes.io/projected/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-kube-api-access-nvm5m\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.938419 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-dns-svc\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.938605 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-config\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.938653 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.941606 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9755bfc9-d53e-4848-8d4b-04fdef46a4ea" (UID: "9755bfc9-d53e-4848-8d4b-04fdef46a4ea"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.944860 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.946884 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-dns-svc\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.946910 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-config\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.970296 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.971718 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-scripts" (OuterVolumeSpecName: "scripts") pod "9755bfc9-d53e-4848-8d4b-04fdef46a4ea" (UID: "9755bfc9-d53e-4848-8d4b-04fdef46a4ea"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.975193 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-kube-api-access-flrfk" (OuterVolumeSpecName: "kube-api-access-flrfk") pod "9755bfc9-d53e-4848-8d4b-04fdef46a4ea" (UID: "9755bfc9-d53e-4848-8d4b-04fdef46a4ea"). InnerVolumeSpecName "kube-api-access-flrfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.975421 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9755bfc9-d53e-4848-8d4b-04fdef46a4ea" (UID: "9755bfc9-d53e-4848-8d4b-04fdef46a4ea"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:27 crc kubenswrapper[4755]: I1006 08:39:27.991011 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvm5m\" (UniqueName: \"kubernetes.io/projected/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-kube-api-access-nvm5m\") pod \"dnsmasq-dns-6bb684768f-k4cdv\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.000624 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9755bfc9-d53e-4848-8d4b-04fdef46a4ea" (UID: "9755bfc9-d53e-4848-8d4b-04fdef46a4ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.013052 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-config-data" (OuterVolumeSpecName: "config-data") pod "9755bfc9-d53e-4848-8d4b-04fdef46a4ea" (UID: "9755bfc9-d53e-4848-8d4b-04fdef46a4ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.040490 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrfhk\" (UniqueName: \"kubernetes.io/projected/794ddd23-a887-4125-a7de-c9281188c8ea-kube-api-access-hrfhk\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.040862 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-ovndb-tls-certs\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.040898 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-config\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.040932 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-combined-ca-bundle\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: 
\"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.041071 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-httpd-config\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.041390 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.041405 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.041415 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.041426 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.041436 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flrfk\" (UniqueName: \"kubernetes.io/projected/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-kube-api-access-flrfk\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.041446 4755 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/9755bfc9-d53e-4848-8d4b-04fdef46a4ea-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.142509 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrfhk\" (UniqueName: \"kubernetes.io/projected/794ddd23-a887-4125-a7de-c9281188c8ea-kube-api-access-hrfhk\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.142626 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-ovndb-tls-certs\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.142668 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-config\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.142711 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-combined-ca-bundle\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.142778 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-httpd-config\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " 
pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.146338 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-httpd-config\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.147889 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-ovndb-tls-certs\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.150006 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-combined-ca-bundle\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.151909 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-config\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.163009 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrfhk\" (UniqueName: \"kubernetes.io/projected/794ddd23-a887-4125-a7de-c9281188c8ea-kube-api-access-hrfhk\") pod \"neutron-74bd7fb97b-tzfvn\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.205591 4755 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.346957 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.513610 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-c6wg6" event={"ID":"9755bfc9-d53e-4848-8d4b-04fdef46a4ea","Type":"ContainerDied","Data":"54416962072eba1abe39696c7cb889aeec7f2a5bbcf99deaaf2084229759f0ca"} Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.513912 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54416962072eba1abe39696c7cb889aeec7f2a5bbcf99deaaf2084229759f0ca" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.513663 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-c6wg6" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.682146 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-k4cdv"] Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.805789 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.807264 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.811946 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.812657 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.812854 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9fgk5" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.819269 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.826716 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.866425 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.866503 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.866557 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1d3acf3-366b-4613-a557-e698a633c5a2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.866763 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.866805 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.866870 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7jsb\" (UniqueName: \"kubernetes.io/projected/a1d3acf3-366b-4613-a557-e698a633c5a2-kube-api-access-p7jsb\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.917734 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-k4cdv"] Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.945221 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-989gj"] Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.950147 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.960278 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-989gj"] Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.971585 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksc6b\" (UniqueName: \"kubernetes.io/projected/e18674b4-2633-4819-8e4c-81e122186c0b-kube-api-access-ksc6b\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.971669 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7jsb\" (UniqueName: \"kubernetes.io/projected/a1d3acf3-366b-4613-a557-e698a633c5a2-kube-api-access-p7jsb\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.971715 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.971745 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-config\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.971762 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.971816 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.971896 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.971915 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1d3acf3-366b-4613-a557-e698a633c5a2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.971930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.972015 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.972034 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.976747 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1d3acf3-366b-4613-a557-e698a633c5a2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.982622 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.993791 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:28 crc kubenswrapper[4755]: I1006 08:39:28.997031 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.032386 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.035524 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7jsb\" (UniqueName: \"kubernetes.io/projected/a1d3acf3-366b-4613-a557-e698a633c5a2-kube-api-access-p7jsb\") pod \"cinder-scheduler-0\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.071754 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.075261 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.075373 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.075409 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.075480 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksc6b\" (UniqueName: 
\"kubernetes.io/projected/e18674b4-2633-4819-8e4c-81e122186c0b-kube-api-access-ksc6b\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.075530 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-config\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.075547 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.076555 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.077450 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-config\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.078038 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.078381 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.079305 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.118204 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.119288 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksc6b\" (UniqueName: \"kubernetes.io/projected/e18674b4-2633-4819-8e4c-81e122186c0b-kube-api-access-ksc6b\") pod \"dnsmasq-dns-6d97fcdd8f-989gj\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.176763 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.177096 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-config-data\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 
08:39:29.177115 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5hnv\" (UniqueName: \"kubernetes.io/projected/8cbaa162-27f8-402b-a056-cf6a3491478f-kube-api-access-l5hnv\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.177158 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8cbaa162-27f8-402b-a056-cf6a3491478f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.177183 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cbaa162-27f8-402b-a056-cf6a3491478f-logs\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.177219 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-config-data-custom\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.177236 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-scripts\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.179966 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-74bd7fb97b-tzfvn"] Oct 06 08:39:29 crc 
kubenswrapper[4755]: W1006 08:39:29.191359 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod794ddd23_a887_4125_a7de_c9281188c8ea.slice/crio-505cfc49ed0cb84bd48121fd4cb9875b38260b745c062f47afbdd9eb9d53703d WatchSource:0}: Error finding container 505cfc49ed0cb84bd48121fd4cb9875b38260b745c062f47afbdd9eb9d53703d: Status 404 returned error can't find the container with id 505cfc49ed0cb84bd48121fd4cb9875b38260b745c062f47afbdd9eb9d53703d Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.219013 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.278803 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8cbaa162-27f8-402b-a056-cf6a3491478f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.278855 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cbaa162-27f8-402b-a056-cf6a3491478f-logs\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.278896 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-config-data-custom\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.278913 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-scripts\") pod 
\"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.278919 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8cbaa162-27f8-402b-a056-cf6a3491478f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.278965 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.279004 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-config-data\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.279022 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5hnv\" (UniqueName: \"kubernetes.io/projected/8cbaa162-27f8-402b-a056-cf6a3491478f-kube-api-access-l5hnv\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.280318 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cbaa162-27f8-402b-a056-cf6a3491478f-logs\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.285205 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-config-data\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.286215 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-config-data-custom\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.287138 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-scripts\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.287817 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.304789 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5hnv\" (UniqueName: \"kubernetes.io/projected/8cbaa162-27f8-402b-a056-cf6a3491478f-kube-api-access-l5hnv\") pod \"cinder-api-0\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.355679 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.392847 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.551952 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74bd7fb97b-tzfvn" event={"ID":"794ddd23-a887-4125-a7de-c9281188c8ea","Type":"ContainerStarted","Data":"4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149"} Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.551996 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74bd7fb97b-tzfvn" event={"ID":"794ddd23-a887-4125-a7de-c9281188c8ea","Type":"ContainerStarted","Data":"505cfc49ed0cb84bd48121fd4cb9875b38260b745c062f47afbdd9eb9d53703d"} Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.587643 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" event={"ID":"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737","Type":"ContainerDied","Data":"fe00796cada4665acc02819b2c60a6c10f8dd06c56b7159ec3b8d625c20d1395"} Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.587782 4755 generic.go:334] "Generic (PLEG): container finished" podID="ab7d7c8d-7a23-44dc-8c9a-12eb45d16737" containerID="fe00796cada4665acc02819b2c60a6c10f8dd06c56b7159ec3b8d625c20d1395" exitCode=0 Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.588179 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" event={"ID":"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737","Type":"ContainerStarted","Data":"451f463cc1a0a6b89b66be23b3d9d4ea1b6684b4c718ebe2e9d3bcaea517dfae"} Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.662946 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76b8d6c486-gk8d2" podUID="1efe6eb8-9918-4061-ad60-7b22276e99a1" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": read tcp 10.217.0.2:39190->10.217.0.145:9311: read: connection reset by peer" Oct 06 08:39:29 crc kubenswrapper[4755]: 
I1006 08:39:29.663078 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76b8d6c486-gk8d2" podUID="1efe6eb8-9918-4061-ad60-7b22276e99a1" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": read tcp 10.217.0.2:39192->10.217.0.145:9311: read: connection reset by peer" Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.675967 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-989gj"] Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.689905 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:39:29 crc kubenswrapper[4755]: I1006 08:39:29.772970 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.009775 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.197929 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-dns-svc\") pod \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.203037 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-ovsdbserver-sb\") pod \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.203117 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-ovsdbserver-nb\") pod 
\"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.203186 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvm5m\" (UniqueName: \"kubernetes.io/projected/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-kube-api-access-nvm5m\") pod \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.203476 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-config\") pod \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\" (UID: \"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737\") " Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.212333 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-kube-api-access-nvm5m" (OuterVolumeSpecName: "kube-api-access-nvm5m") pod "ab7d7c8d-7a23-44dc-8c9a-12eb45d16737" (UID: "ab7d7c8d-7a23-44dc-8c9a-12eb45d16737"). InnerVolumeSpecName "kube-api-access-nvm5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.272144 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ab7d7c8d-7a23-44dc-8c9a-12eb45d16737" (UID: "ab7d7c8d-7a23-44dc-8c9a-12eb45d16737"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.278708 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-config" (OuterVolumeSpecName: "config") pod "ab7d7c8d-7a23-44dc-8c9a-12eb45d16737" (UID: "ab7d7c8d-7a23-44dc-8c9a-12eb45d16737"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.287086 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ab7d7c8d-7a23-44dc-8c9a-12eb45d16737" (UID: "ab7d7c8d-7a23-44dc-8c9a-12eb45d16737"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.292189 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.299984 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ab7d7c8d-7a23-44dc-8c9a-12eb45d16737" (UID: "ab7d7c8d-7a23-44dc-8c9a-12eb45d16737"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.307617 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.307677 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.307692 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.307705 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.307723 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvm5m\" (UniqueName: \"kubernetes.io/projected/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737-kube-api-access-nvm5m\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.408499 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1efe6eb8-9918-4061-ad60-7b22276e99a1-logs\") pod \"1efe6eb8-9918-4061-ad60-7b22276e99a1\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.408551 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-combined-ca-bundle\") pod 
\"1efe6eb8-9918-4061-ad60-7b22276e99a1\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.408671 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-config-data-custom\") pod \"1efe6eb8-9918-4061-ad60-7b22276e99a1\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.408768 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv6dt\" (UniqueName: \"kubernetes.io/projected/1efe6eb8-9918-4061-ad60-7b22276e99a1-kube-api-access-hv6dt\") pod \"1efe6eb8-9918-4061-ad60-7b22276e99a1\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.408841 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-config-data\") pod \"1efe6eb8-9918-4061-ad60-7b22276e99a1\" (UID: \"1efe6eb8-9918-4061-ad60-7b22276e99a1\") " Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.413426 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1efe6eb8-9918-4061-ad60-7b22276e99a1-logs" (OuterVolumeSpecName: "logs") pod "1efe6eb8-9918-4061-ad60-7b22276e99a1" (UID: "1efe6eb8-9918-4061-ad60-7b22276e99a1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.415775 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1efe6eb8-9918-4061-ad60-7b22276e99a1" (UID: "1efe6eb8-9918-4061-ad60-7b22276e99a1"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.416447 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1efe6eb8-9918-4061-ad60-7b22276e99a1-kube-api-access-hv6dt" (OuterVolumeSpecName: "kube-api-access-hv6dt") pod "1efe6eb8-9918-4061-ad60-7b22276e99a1" (UID: "1efe6eb8-9918-4061-ad60-7b22276e99a1"). InnerVolumeSpecName "kube-api-access-hv6dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.439390 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1efe6eb8-9918-4061-ad60-7b22276e99a1" (UID: "1efe6eb8-9918-4061-ad60-7b22276e99a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.454532 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-config-data" (OuterVolumeSpecName: "config-data") pod "1efe6eb8-9918-4061-ad60-7b22276e99a1" (UID: "1efe6eb8-9918-4061-ad60-7b22276e99a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.510924 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.510959 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv6dt\" (UniqueName: \"kubernetes.io/projected/1efe6eb8-9918-4061-ad60-7b22276e99a1-kube-api-access-hv6dt\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.510969 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.510977 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1efe6eb8-9918-4061-ad60-7b22276e99a1-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.510986 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1efe6eb8-9918-4061-ad60-7b22276e99a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.601153 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74bd7fb97b-tzfvn" event={"ID":"794ddd23-a887-4125-a7de-c9281188c8ea","Type":"ContainerStarted","Data":"f533d98a7717478ebf20dff8ae1eac673afbd01ca2011a47d26517f30346fb3b"} Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.601643 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.604091 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"8cbaa162-27f8-402b-a056-cf6a3491478f","Type":"ContainerStarted","Data":"d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef"} Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.604150 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8cbaa162-27f8-402b-a056-cf6a3491478f","Type":"ContainerStarted","Data":"02ec3ab499792c19defb9c7d61b91940ac0538886dc7647a88693919bb290362"} Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.607113 4755 generic.go:334] "Generic (PLEG): container finished" podID="1efe6eb8-9918-4061-ad60-7b22276e99a1" containerID="2d33c949525d8849955afacf4b3177e5a7621df526948c93d6ceb699b6f81af4" exitCode=0 Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.607173 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76b8d6c486-gk8d2" event={"ID":"1efe6eb8-9918-4061-ad60-7b22276e99a1","Type":"ContainerDied","Data":"2d33c949525d8849955afacf4b3177e5a7621df526948c93d6ceb699b6f81af4"} Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.607178 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-76b8d6c486-gk8d2" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.607320 4755 scope.go:117] "RemoveContainer" containerID="2d33c949525d8849955afacf4b3177e5a7621df526948c93d6ceb699b6f81af4" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.610189 4755 generic.go:334] "Generic (PLEG): container finished" podID="e18674b4-2633-4819-8e4c-81e122186c0b" containerID="a3d8abe09ae4852061ef69995bb95d944415a4594c9fb66cd0b955b48752f742" exitCode=0 Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.607199 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76b8d6c486-gk8d2" event={"ID":"1efe6eb8-9918-4061-ad60-7b22276e99a1","Type":"ContainerDied","Data":"1c5e44755e6fb9f5f0a66b616f3fb63dc30c09c1e18eb5924b024b93f83a35c5"} Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.610854 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" event={"ID":"e18674b4-2633-4819-8e4c-81e122186c0b","Type":"ContainerDied","Data":"a3d8abe09ae4852061ef69995bb95d944415a4594c9fb66cd0b955b48752f742"} Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.610914 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" event={"ID":"e18674b4-2633-4819-8e4c-81e122186c0b","Type":"ContainerStarted","Data":"667332f85c4483a39343b8fcfd3d9f7d68689c5f7d4efe6916661f27cc5d281a"} Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.616733 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" event={"ID":"ab7d7c8d-7a23-44dc-8c9a-12eb45d16737","Type":"ContainerDied","Data":"451f463cc1a0a6b89b66be23b3d9d4ea1b6684b4c718ebe2e9d3bcaea517dfae"} Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.624050 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-k4cdv" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.631591 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-74bd7fb97b-tzfvn" podStartSLOduration=3.631557512 podStartE2EDuration="3.631557512s" podCreationTimestamp="2025-10-06 08:39:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:30.62161826 +0000 UTC m=+1027.450933474" watchObservedRunningTime="2025-10-06 08:39:30.631557512 +0000 UTC m=+1027.460872726" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.634279 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a1d3acf3-366b-4613-a557-e698a633c5a2","Type":"ContainerStarted","Data":"0942bce3708e3c25de48d47456668afb85e43b163aeeec48a5cdc35ebd198781"} Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.692325 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76b8d6c486-gk8d2"] Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.718114 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-76b8d6c486-gk8d2"] Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.738616 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-k4cdv"] Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.745849 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-k4cdv"] Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.808720 4755 scope.go:117] "RemoveContainer" containerID="bb8eac9f031b4bdef5a53f7ce84a12e1231f032e94ed0e1988f57f55ff61e962" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.921636 4755 scope.go:117] "RemoveContainer" containerID="2d33c949525d8849955afacf4b3177e5a7621df526948c93d6ceb699b6f81af4" Oct 06 08:39:30 crc kubenswrapper[4755]: 
E1006 08:39:30.922333 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d33c949525d8849955afacf4b3177e5a7621df526948c93d6ceb699b6f81af4\": container with ID starting with 2d33c949525d8849955afacf4b3177e5a7621df526948c93d6ceb699b6f81af4 not found: ID does not exist" containerID="2d33c949525d8849955afacf4b3177e5a7621df526948c93d6ceb699b6f81af4" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.922373 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d33c949525d8849955afacf4b3177e5a7621df526948c93d6ceb699b6f81af4"} err="failed to get container status \"2d33c949525d8849955afacf4b3177e5a7621df526948c93d6ceb699b6f81af4\": rpc error: code = NotFound desc = could not find container \"2d33c949525d8849955afacf4b3177e5a7621df526948c93d6ceb699b6f81af4\": container with ID starting with 2d33c949525d8849955afacf4b3177e5a7621df526948c93d6ceb699b6f81af4 not found: ID does not exist" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.922399 4755 scope.go:117] "RemoveContainer" containerID="bb8eac9f031b4bdef5a53f7ce84a12e1231f032e94ed0e1988f57f55ff61e962" Oct 06 08:39:30 crc kubenswrapper[4755]: E1006 08:39:30.922873 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8eac9f031b4bdef5a53f7ce84a12e1231f032e94ed0e1988f57f55ff61e962\": container with ID starting with bb8eac9f031b4bdef5a53f7ce84a12e1231f032e94ed0e1988f57f55ff61e962 not found: ID does not exist" containerID="bb8eac9f031b4bdef5a53f7ce84a12e1231f032e94ed0e1988f57f55ff61e962" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.922901 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8eac9f031b4bdef5a53f7ce84a12e1231f032e94ed0e1988f57f55ff61e962"} err="failed to get container status \"bb8eac9f031b4bdef5a53f7ce84a12e1231f032e94ed0e1988f57f55ff61e962\": 
rpc error: code = NotFound desc = could not find container \"bb8eac9f031b4bdef5a53f7ce84a12e1231f032e94ed0e1988f57f55ff61e962\": container with ID starting with bb8eac9f031b4bdef5a53f7ce84a12e1231f032e94ed0e1988f57f55ff61e962 not found: ID does not exist" Oct 06 08:39:30 crc kubenswrapper[4755]: I1006 08:39:30.922919 4755 scope.go:117] "RemoveContainer" containerID="fe00796cada4665acc02819b2c60a6c10f8dd06c56b7159ec3b8d625c20d1395" Oct 06 08:39:31 crc kubenswrapper[4755]: I1006 08:39:31.645848 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" event={"ID":"e18674b4-2633-4819-8e4c-81e122186c0b","Type":"ContainerStarted","Data":"54d3032fb98e75f87d1d15129afc65687b122c385be02c187f1f29a22bfea33e"} Oct 06 08:39:31 crc kubenswrapper[4755]: I1006 08:39:31.646136 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:31 crc kubenswrapper[4755]: I1006 08:39:31.649590 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8cbaa162-27f8-402b-a056-cf6a3491478f","Type":"ContainerStarted","Data":"bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd"} Oct 06 08:39:31 crc kubenswrapper[4755]: I1006 08:39:31.667222 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" podStartSLOduration=3.667202498 podStartE2EDuration="3.667202498s" podCreationTimestamp="2025-10-06 08:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:31.662859228 +0000 UTC m=+1028.492174442" watchObservedRunningTime="2025-10-06 08:39:31.667202498 +0000 UTC m=+1028.496517712" Oct 06 08:39:31 crc kubenswrapper[4755]: I1006 08:39:31.685001 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.684976479 
podStartE2EDuration="3.684976479s" podCreationTimestamp="2025-10-06 08:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:31.680944976 +0000 UTC m=+1028.510260200" watchObservedRunningTime="2025-10-06 08:39:31.684976479 +0000 UTC m=+1028.514291693" Oct 06 08:39:31 crc kubenswrapper[4755]: I1006 08:39:31.891014 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1efe6eb8-9918-4061-ad60-7b22276e99a1" path="/var/lib/kubelet/pods/1efe6eb8-9918-4061-ad60-7b22276e99a1/volumes" Oct 06 08:39:31 crc kubenswrapper[4755]: I1006 08:39:31.892089 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab7d7c8d-7a23-44dc-8c9a-12eb45d16737" path="/var/lib/kubelet/pods/ab7d7c8d-7a23-44dc-8c9a-12eb45d16737/volumes" Oct 06 08:39:32 crc kubenswrapper[4755]: I1006 08:39:32.550276 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:32 crc kubenswrapper[4755]: I1006 08:39:32.658835 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a1d3acf3-366b-4613-a557-e698a633c5a2","Type":"ContainerStarted","Data":"a3780e0e91031fad5949c60e2638fef313557c4bc63c2c053c52ce72f138d4fe"} Oct 06 08:39:32 crc kubenswrapper[4755]: I1006 08:39:32.659807 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a1d3acf3-366b-4613-a557-e698a633c5a2","Type":"ContainerStarted","Data":"7ba6e4a7c57d37919287a64b255c83b8470e2ad36899e5645b1d1d0386ef9ff5"} Oct 06 08:39:32 crc kubenswrapper[4755]: I1006 08:39:32.659840 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 06 08:39:32 crc kubenswrapper[4755]: I1006 08:39:32.678652 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.311952458 
podStartE2EDuration="4.678633239s" podCreationTimestamp="2025-10-06 08:39:28 +0000 UTC" firstStartedPulling="2025-10-06 08:39:29.754991591 +0000 UTC m=+1026.584306805" lastFinishedPulling="2025-10-06 08:39:31.121672372 +0000 UTC m=+1027.950987586" observedRunningTime="2025-10-06 08:39:32.676443744 +0000 UTC m=+1029.505758958" watchObservedRunningTime="2025-10-06 08:39:32.678633239 +0000 UTC m=+1029.507948453" Oct 06 08:39:33 crc kubenswrapper[4755]: I1006 08:39:33.666658 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8cbaa162-27f8-402b-a056-cf6a3491478f" containerName="cinder-api-log" containerID="cri-o://d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef" gracePeriod=30 Oct 06 08:39:33 crc kubenswrapper[4755]: I1006 08:39:33.668405 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8cbaa162-27f8-402b-a056-cf6a3491478f" containerName="cinder-api" containerID="cri-o://bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd" gracePeriod=30 Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.219321 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.238128 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.332645 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6854f5796f-f7f5s"] Oct 06 08:39:34 crc kubenswrapper[4755]: E1006 08:39:34.333048 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbaa162-27f8-402b-a056-cf6a3491478f" containerName="cinder-api-log" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.333067 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbaa162-27f8-402b-a056-cf6a3491478f" containerName="cinder-api-log" Oct 06 08:39:34 crc kubenswrapper[4755]: E1006 08:39:34.333088 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7d7c8d-7a23-44dc-8c9a-12eb45d16737" containerName="init" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.333096 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7d7c8d-7a23-44dc-8c9a-12eb45d16737" containerName="init" Oct 06 08:39:34 crc kubenswrapper[4755]: E1006 08:39:34.333112 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efe6eb8-9918-4061-ad60-7b22276e99a1" containerName="barbican-api" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.333120 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efe6eb8-9918-4061-ad60-7b22276e99a1" containerName="barbican-api" Oct 06 08:39:34 crc kubenswrapper[4755]: E1006 08:39:34.333139 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efe6eb8-9918-4061-ad60-7b22276e99a1" containerName="barbican-api-log" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.333148 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efe6eb8-9918-4061-ad60-7b22276e99a1" containerName="barbican-api-log" Oct 06 08:39:34 crc kubenswrapper[4755]: E1006 08:39:34.333169 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbaa162-27f8-402b-a056-cf6a3491478f" containerName="cinder-api" Oct 06 08:39:34 crc 
kubenswrapper[4755]: I1006 08:39:34.333178 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbaa162-27f8-402b-a056-cf6a3491478f" containerName="cinder-api" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.333390 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cbaa162-27f8-402b-a056-cf6a3491478f" containerName="cinder-api" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.333407 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cbaa162-27f8-402b-a056-cf6a3491478f" containerName="cinder-api-log" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.333416 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efe6eb8-9918-4061-ad60-7b22276e99a1" containerName="barbican-api-log" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.333429 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efe6eb8-9918-4061-ad60-7b22276e99a1" containerName="barbican-api" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.333440 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7d7c8d-7a23-44dc-8c9a-12eb45d16737" containerName="init" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.335784 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.343273 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.343492 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.346438 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6854f5796f-f7f5s"] Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.398741 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-combined-ca-bundle\") pod \"8cbaa162-27f8-402b-a056-cf6a3491478f\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.398820 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5hnv\" (UniqueName: \"kubernetes.io/projected/8cbaa162-27f8-402b-a056-cf6a3491478f-kube-api-access-l5hnv\") pod \"8cbaa162-27f8-402b-a056-cf6a3491478f\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.398899 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8cbaa162-27f8-402b-a056-cf6a3491478f-etc-machine-id\") pod \"8cbaa162-27f8-402b-a056-cf6a3491478f\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.399007 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-scripts\") pod \"8cbaa162-27f8-402b-a056-cf6a3491478f\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " Oct 06 08:39:34 
crc kubenswrapper[4755]: I1006 08:39:34.399047 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cbaa162-27f8-402b-a056-cf6a3491478f-logs\") pod \"8cbaa162-27f8-402b-a056-cf6a3491478f\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.399091 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-config-data-custom\") pod \"8cbaa162-27f8-402b-a056-cf6a3491478f\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.399131 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-config-data\") pod \"8cbaa162-27f8-402b-a056-cf6a3491478f\" (UID: \"8cbaa162-27f8-402b-a056-cf6a3491478f\") " Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.408594 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cbaa162-27f8-402b-a056-cf6a3491478f-logs" (OuterVolumeSpecName: "logs") pod "8cbaa162-27f8-402b-a056-cf6a3491478f" (UID: "8cbaa162-27f8-402b-a056-cf6a3491478f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.399246 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8cbaa162-27f8-402b-a056-cf6a3491478f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8cbaa162-27f8-402b-a056-cf6a3491478f" (UID: "8cbaa162-27f8-402b-a056-cf6a3491478f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.410493 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8cbaa162-27f8-402b-a056-cf6a3491478f" (UID: "8cbaa162-27f8-402b-a056-cf6a3491478f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.411145 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cbaa162-27f8-402b-a056-cf6a3491478f-kube-api-access-l5hnv" (OuterVolumeSpecName: "kube-api-access-l5hnv") pod "8cbaa162-27f8-402b-a056-cf6a3491478f" (UID: "8cbaa162-27f8-402b-a056-cf6a3491478f"). InnerVolumeSpecName "kube-api-access-l5hnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.427194 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-scripts" (OuterVolumeSpecName: "scripts") pod "8cbaa162-27f8-402b-a056-cf6a3491478f" (UID: "8cbaa162-27f8-402b-a056-cf6a3491478f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.441960 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cbaa162-27f8-402b-a056-cf6a3491478f" (UID: "8cbaa162-27f8-402b-a056-cf6a3491478f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.462425 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-config-data" (OuterVolumeSpecName: "config-data") pod "8cbaa162-27f8-402b-a056-cf6a3491478f" (UID: "8cbaa162-27f8-402b-a056-cf6a3491478f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509377 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-httpd-config\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509439 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-ovndb-tls-certs\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509539 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-internal-tls-certs\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509578 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-config\") pod \"neutron-6854f5796f-f7f5s\" (UID: 
\"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509653 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-combined-ca-bundle\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509686 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-public-tls-certs\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509702 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6ncn\" (UniqueName: \"kubernetes.io/projected/7250ab29-650e-46db-ba50-5d20579db8b6-kube-api-access-x6ncn\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509747 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5hnv\" (UniqueName: \"kubernetes.io/projected/8cbaa162-27f8-402b-a056-cf6a3491478f-kube-api-access-l5hnv\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509760 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8cbaa162-27f8-402b-a056-cf6a3491478f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509769 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509776 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cbaa162-27f8-402b-a056-cf6a3491478f-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509784 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509793 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.509803 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cbaa162-27f8-402b-a056-cf6a3491478f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.612037 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-internal-tls-certs\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.612259 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-config\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.612358 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-combined-ca-bundle\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.612443 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-public-tls-certs\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.612513 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6ncn\" (UniqueName: \"kubernetes.io/projected/7250ab29-650e-46db-ba50-5d20579db8b6-kube-api-access-x6ncn\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.612677 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-httpd-config\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.612784 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-ovndb-tls-certs\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.616204 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-ovndb-tls-certs\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.616216 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-internal-tls-certs\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.616468 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-config\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.617487 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-combined-ca-bundle\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.618079 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-public-tls-certs\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.618905 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7250ab29-650e-46db-ba50-5d20579db8b6-httpd-config\") pod \"neutron-6854f5796f-f7f5s\" 
(UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.629541 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6ncn\" (UniqueName: \"kubernetes.io/projected/7250ab29-650e-46db-ba50-5d20579db8b6-kube-api-access-x6ncn\") pod \"neutron-6854f5796f-f7f5s\" (UID: \"7250ab29-650e-46db-ba50-5d20579db8b6\") " pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.656461 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.684695 4755 generic.go:334] "Generic (PLEG): container finished" podID="8cbaa162-27f8-402b-a056-cf6a3491478f" containerID="bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd" exitCode=0 Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.684737 4755 generic.go:334] "Generic (PLEG): container finished" podID="8cbaa162-27f8-402b-a056-cf6a3491478f" containerID="d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef" exitCode=143 Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.684761 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8cbaa162-27f8-402b-a056-cf6a3491478f","Type":"ContainerDied","Data":"bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd"} Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.684809 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8cbaa162-27f8-402b-a056-cf6a3491478f","Type":"ContainerDied","Data":"d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef"} Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.684824 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"8cbaa162-27f8-402b-a056-cf6a3491478f","Type":"ContainerDied","Data":"02ec3ab499792c19defb9c7d61b91940ac0538886dc7647a88693919bb290362"} Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.684841 4755 scope.go:117] "RemoveContainer" containerID="bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.685935 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.710012 4755 scope.go:117] "RemoveContainer" containerID="d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.726965 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.735557 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.747616 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.749223 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.755820 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.756238 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.756468 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.773902 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.783753 4755 scope.go:117] "RemoveContainer" containerID="bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd" Oct 06 08:39:34 crc kubenswrapper[4755]: E1006 08:39:34.787322 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd\": container with ID starting with bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd not found: ID does not exist" containerID="bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.787378 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd"} err="failed to get container status \"bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd\": rpc error: code = NotFound desc = could not find container \"bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd\": container with ID starting with bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd not found: ID does not exist" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 
08:39:34.787410 4755 scope.go:117] "RemoveContainer" containerID="d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef" Oct 06 08:39:34 crc kubenswrapper[4755]: E1006 08:39:34.791806 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef\": container with ID starting with d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef not found: ID does not exist" containerID="d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.791853 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef"} err="failed to get container status \"d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef\": rpc error: code = NotFound desc = could not find container \"d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef\": container with ID starting with d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef not found: ID does not exist" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.791878 4755 scope.go:117] "RemoveContainer" containerID="bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.792141 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd"} err="failed to get container status \"bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd\": rpc error: code = NotFound desc = could not find container \"bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd\": container with ID starting with bed789797305b6da9f198b7765088f924625865478770e85cdbe040eccff02dd not found: ID does not exist" Oct 06 08:39:34 crc 
kubenswrapper[4755]: I1006 08:39:34.792167 4755 scope.go:117] "RemoveContainer" containerID="d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.792410 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef"} err="failed to get container status \"d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef\": rpc error: code = NotFound desc = could not find container \"d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef\": container with ID starting with d83bf6c85f6906fabf014586c7e4ec5d45bc041202c47d24a605b52c3c214aef not found: ID does not exist" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.920762 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.920808 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3a80bf3-70cf-465d-a429-caf75c375027-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.920829 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.920846 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-config-data\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.920864 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.920889 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dk8p\" (UniqueName: \"kubernetes.io/projected/d3a80bf3-70cf-465d-a429-caf75c375027-kube-api-access-6dk8p\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.921097 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-scripts\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.921200 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a80bf3-70cf-465d-a429-caf75c375027-logs\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:34 crc kubenswrapper[4755]: I1006 08:39:34.921280 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.022888 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.022949 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3a80bf3-70cf-465d-a429-caf75c375027-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.022973 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.022991 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-config-data\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.023021 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc 
kubenswrapper[4755]: I1006 08:39:35.023051 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dk8p\" (UniqueName: \"kubernetes.io/projected/d3a80bf3-70cf-465d-a429-caf75c375027-kube-api-access-6dk8p\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.023080 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-scripts\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.023110 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a80bf3-70cf-465d-a429-caf75c375027-logs\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.023136 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.023657 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3a80bf3-70cf-465d-a429-caf75c375027-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.023933 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a80bf3-70cf-465d-a429-caf75c375027-logs\") pod \"cinder-api-0\" (UID: 
\"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.027469 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.027865 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.028297 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-scripts\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.030747 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.032496 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-config-data\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.039101 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d3a80bf3-70cf-465d-a429-caf75c375027-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.055899 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dk8p\" (UniqueName: \"kubernetes.io/projected/d3a80bf3-70cf-465d-a429-caf75c375027-kube-api-access-6dk8p\") pod \"cinder-api-0\" (UID: \"d3a80bf3-70cf-465d-a429-caf75c375027\") " pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.136392 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.234731 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6854f5796f-f7f5s"] Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.617362 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:35 crc kubenswrapper[4755]: W1006 08:39:35.622407 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3a80bf3_70cf_465d_a429_caf75c375027.slice/crio-5a4032cb55b7a73d0a3fa8426b01d3164bdaa5da9410f56e9822d348ff103414 WatchSource:0}: Error finding container 5a4032cb55b7a73d0a3fa8426b01d3164bdaa5da9410f56e9822d348ff103414: Status 404 returned error can't find the container with id 5a4032cb55b7a73d0a3fa8426b01d3164bdaa5da9410f56e9822d348ff103414 Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.703083 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6854f5796f-f7f5s" event={"ID":"7250ab29-650e-46db-ba50-5d20579db8b6","Type":"ContainerStarted","Data":"501c48cbdad0c39887e158874d9f4c7d2a43e82bb674e85f937855689b18c99c"} Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.703536 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-6854f5796f-f7f5s" event={"ID":"7250ab29-650e-46db-ba50-5d20579db8b6","Type":"ContainerStarted","Data":"746e2b737896edf8794940970333c2e6c2dde4a3b84416b35a505a393d597098"} Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.703556 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6854f5796f-f7f5s" event={"ID":"7250ab29-650e-46db-ba50-5d20579db8b6","Type":"ContainerStarted","Data":"1a0a63bec6af583bba3e6f63b1ecf75b29f076d6eb6a866f6b3eb51eee1102c9"} Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.703631 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.705375 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3a80bf3-70cf-465d-a429-caf75c375027","Type":"ContainerStarted","Data":"5a4032cb55b7a73d0a3fa8426b01d3164bdaa5da9410f56e9822d348ff103414"} Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.738530 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6854f5796f-f7f5s" podStartSLOduration=1.738505683 podStartE2EDuration="1.738505683s" podCreationTimestamp="2025-10-06 08:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:35.728451827 +0000 UTC m=+1032.557767051" watchObservedRunningTime="2025-10-06 08:39:35.738505683 +0000 UTC m=+1032.567820897" Oct 06 08:39:35 crc kubenswrapper[4755]: I1006 08:39:35.897037 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cbaa162-27f8-402b-a056-cf6a3491478f" path="/var/lib/kubelet/pods/8cbaa162-27f8-402b-a056-cf6a3491478f/volumes" Oct 06 08:39:36 crc kubenswrapper[4755]: I1006 08:39:36.439730 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-67f564d7bf-cx47l" Oct 06 08:39:36 crc 
kubenswrapper[4755]: I1006 08:39:36.729409 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3a80bf3-70cf-465d-a429-caf75c375027","Type":"ContainerStarted","Data":"52fbae12ad25a9c6e507f88a52ad4cd96134ac3522c6699489192851789dd4d5"} Oct 06 08:39:37 crc kubenswrapper[4755]: I1006 08:39:37.744443 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3a80bf3-70cf-465d-a429-caf75c375027","Type":"ContainerStarted","Data":"505e4a2eed9535c0852b2526270d112a061ff556334ca0d36d4c83acee6dab26"} Oct 06 08:39:37 crc kubenswrapper[4755]: I1006 08:39:37.745100 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 06 08:39:37 crc kubenswrapper[4755]: I1006 08:39:37.777212 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.777189037 podStartE2EDuration="3.777189037s" podCreationTimestamp="2025-10-06 08:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:37.766984227 +0000 UTC m=+1034.596299441" watchObservedRunningTime="2025-10-06 08:39:37.777189037 +0000 UTC m=+1034.606504251" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.439957 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.441456 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.443539 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.444524 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-bw2xj" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.447171 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.469860 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.603688 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.603782 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-openstack-config\") pod \"openstackclient\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.603819 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-openstack-config-secret\") pod \"openstackclient\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.603851 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfkcr\" (UniqueName: \"kubernetes.io/projected/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-kube-api-access-vfkcr\") pod \"openstackclient\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.615082 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:38 crc kubenswrapper[4755]: E1006 08:39:38.615748 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-vfkcr openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.627359 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.641725 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.644001 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.651778 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.705462 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9805ff75-3e68-41eb-a711-ecc8e70ee16a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9805ff75-3e68-41eb-a711-ecc8e70ee16a\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.705543 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9805ff75-3e68-41eb-a711-ecc8e70ee16a-openstack-config\") pod \"openstackclient\" (UID: \"9805ff75-3e68-41eb-a711-ecc8e70ee16a\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.707007 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.707064 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9805ff75-3e68-41eb-a711-ecc8e70ee16a-openstack-config-secret\") pod \"openstackclient\" (UID: \"9805ff75-3e68-41eb-a711-ecc8e70ee16a\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.707103 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-openstack-config\") pod \"openstackclient\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.707174 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-openstack-config-secret\") pod \"openstackclient\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.707240 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfkcr\" (UniqueName: \"kubernetes.io/projected/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-kube-api-access-vfkcr\") pod \"openstackclient\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.707278 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c6jp\" (UniqueName: \"kubernetes.io/projected/9805ff75-3e68-41eb-a711-ecc8e70ee16a-kube-api-access-4c6jp\") pod \"openstackclient\" (UID: \"9805ff75-3e68-41eb-a711-ecc8e70ee16a\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.707924 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-openstack-config\") pod \"openstackclient\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: E1006 08:39:38.709115 4755 projected.go:194] Error preparing data for projected volume kube-api-access-vfkcr for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object 
reference (93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd) does not match the UID in record. The object might have been deleted and then recreated Oct 06 08:39:38 crc kubenswrapper[4755]: E1006 08:39:38.709185 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-kube-api-access-vfkcr podName:93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd nodeName:}" failed. No retries permitted until 2025-10-06 08:39:39.209165453 +0000 UTC m=+1036.038480667 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vfkcr" (UniqueName: "kubernetes.io/projected/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-kube-api-access-vfkcr") pod "openstackclient" (UID: "93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd) does not match the UID in record. The object might have been deleted and then recreated Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.713816 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.716844 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-openstack-config-secret\") pod \"openstackclient\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.751922 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.756381 4755 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd" podUID="9805ff75-3e68-41eb-a711-ecc8e70ee16a" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.809482 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9805ff75-3e68-41eb-a711-ecc8e70ee16a-openstack-config-secret\") pod \"openstackclient\" (UID: \"9805ff75-3e68-41eb-a711-ecc8e70ee16a\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.809848 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c6jp\" (UniqueName: \"kubernetes.io/projected/9805ff75-3e68-41eb-a711-ecc8e70ee16a-kube-api-access-4c6jp\") pod \"openstackclient\" (UID: \"9805ff75-3e68-41eb-a711-ecc8e70ee16a\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.810078 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9805ff75-3e68-41eb-a711-ecc8e70ee16a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9805ff75-3e68-41eb-a711-ecc8e70ee16a\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.810281 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9805ff75-3e68-41eb-a711-ecc8e70ee16a-openstack-config\") pod \"openstackclient\" (UID: \"9805ff75-3e68-41eb-a711-ecc8e70ee16a\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.811548 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/9805ff75-3e68-41eb-a711-ecc8e70ee16a-openstack-config\") pod \"openstackclient\" (UID: \"9805ff75-3e68-41eb-a711-ecc8e70ee16a\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.813172 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9805ff75-3e68-41eb-a711-ecc8e70ee16a-openstack-config-secret\") pod \"openstackclient\" (UID: \"9805ff75-3e68-41eb-a711-ecc8e70ee16a\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.819121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9805ff75-3e68-41eb-a711-ecc8e70ee16a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9805ff75-3e68-41eb-a711-ecc8e70ee16a\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.828087 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c6jp\" (UniqueName: \"kubernetes.io/projected/9805ff75-3e68-41eb-a711-ecc8e70ee16a-kube-api-access-4c6jp\") pod \"openstackclient\" (UID: \"9805ff75-3e68-41eb-a711-ecc8e70ee16a\") " pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.893419 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.911899 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-openstack-config\") pod \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.911965 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-openstack-config-secret\") pod \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.912071 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-combined-ca-bundle\") pod \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\" (UID: \"93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd\") " Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.912680 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfkcr\" (UniqueName: \"kubernetes.io/projected/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-kube-api-access-vfkcr\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.914427 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd" (UID: "93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.919518 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd" (UID: "93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.920653 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd" (UID: "93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:38 crc kubenswrapper[4755]: I1006 08:39:38.959433 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.013981 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.014018 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.014030 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.356720 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.410789 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-4qjn2"] Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.411004 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-699df9757c-4qjn2" podUID="46024b0b-7959-469e-be3a-72570d94b1b2" containerName="dnsmasq-dns" containerID="cri-o://9a582759371a9ac69d3e8fb77e85d135abaab22ab0ec19e4764bf1ee9f607e2a" gracePeriod=10 Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.443264 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.471016 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.511901 4755 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:39:39 crc kubenswrapper[4755]: E1006 08:39:39.614144 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46024b0b_7959_469e_be3a_72570d94b1b2.slice/crio-conmon-9a582759371a9ac69d3e8fb77e85d135abaab22ab0ec19e4764bf1ee9f607e2a.scope\": RecentStats: unable to find data in memory cache]" Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.769109 4755 generic.go:334] "Generic (PLEG): container finished" podID="46024b0b-7959-469e-be3a-72570d94b1b2" containerID="9a582759371a9ac69d3e8fb77e85d135abaab22ab0ec19e4764bf1ee9f607e2a" exitCode=0 Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.769184 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-4qjn2" event={"ID":"46024b0b-7959-469e-be3a-72570d94b1b2","Type":"ContainerDied","Data":"9a582759371a9ac69d3e8fb77e85d135abaab22ab0ec19e4764bf1ee9f607e2a"} Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.771208 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9805ff75-3e68-41eb-a711-ecc8e70ee16a","Type":"ContainerStarted","Data":"402e6b6ba8744f86ae64d949d076056b13abe11d4e589380db4091076f70f54e"} Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.771237 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a1d3acf3-366b-4613-a557-e698a633c5a2" containerName="cinder-scheduler" containerID="cri-o://7ba6e4a7c57d37919287a64b255c83b8470e2ad36899e5645b1d1d0386ef9ff5" gracePeriod=30 Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.771291 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.771756 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a1d3acf3-366b-4613-a557-e698a633c5a2" containerName="probe" containerID="cri-o://a3780e0e91031fad5949c60e2638fef313557c4bc63c2c053c52ce72f138d4fe" gracePeriod=30 Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.775330 4755 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd" podUID="9805ff75-3e68-41eb-a711-ecc8e70ee16a" Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.903621 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd" path="/var/lib/kubelet/pods/93908f8a-eaa4-46c9-9cb0-a43ffc5ec4cd/volumes" Oct 06 08:39:39 crc kubenswrapper[4755]: I1006 08:39:39.951844 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.030480 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-ovsdbserver-nb\") pod \"46024b0b-7959-469e-be3a-72570d94b1b2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.030537 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zchhv\" (UniqueName: \"kubernetes.io/projected/46024b0b-7959-469e-be3a-72570d94b1b2-kube-api-access-zchhv\") pod \"46024b0b-7959-469e-be3a-72570d94b1b2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.031318 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-config\") pod \"46024b0b-7959-469e-be3a-72570d94b1b2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.031351 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-ovsdbserver-sb\") pod \"46024b0b-7959-469e-be3a-72570d94b1b2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.031419 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-dns-svc\") pod \"46024b0b-7959-469e-be3a-72570d94b1b2\" (UID: \"46024b0b-7959-469e-be3a-72570d94b1b2\") " Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.035968 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/46024b0b-7959-469e-be3a-72570d94b1b2-kube-api-access-zchhv" (OuterVolumeSpecName: "kube-api-access-zchhv") pod "46024b0b-7959-469e-be3a-72570d94b1b2" (UID: "46024b0b-7959-469e-be3a-72570d94b1b2"). InnerVolumeSpecName "kube-api-access-zchhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.076738 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-config" (OuterVolumeSpecName: "config") pod "46024b0b-7959-469e-be3a-72570d94b1b2" (UID: "46024b0b-7959-469e-be3a-72570d94b1b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.091708 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46024b0b-7959-469e-be3a-72570d94b1b2" (UID: "46024b0b-7959-469e-be3a-72570d94b1b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.097382 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "46024b0b-7959-469e-be3a-72570d94b1b2" (UID: "46024b0b-7959-469e-be3a-72570d94b1b2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.118001 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46024b0b-7959-469e-be3a-72570d94b1b2" (UID: "46024b0b-7959-469e-be3a-72570d94b1b2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.132767 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.132795 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.132828 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zchhv\" (UniqueName: \"kubernetes.io/projected/46024b0b-7959-469e-be3a-72570d94b1b2-kube-api-access-zchhv\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.132839 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.132847 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46024b0b-7959-469e-be3a-72570d94b1b2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.781652 4755 generic.go:334] "Generic (PLEG): container finished" podID="a1d3acf3-366b-4613-a557-e698a633c5a2" containerID="a3780e0e91031fad5949c60e2638fef313557c4bc63c2c053c52ce72f138d4fe" exitCode=0 Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.781760 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a1d3acf3-366b-4613-a557-e698a633c5a2","Type":"ContainerDied","Data":"a3780e0e91031fad5949c60e2638fef313557c4bc63c2c053c52ce72f138d4fe"} Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 
08:39:40.799060 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-4qjn2" event={"ID":"46024b0b-7959-469e-be3a-72570d94b1b2","Type":"ContainerDied","Data":"036afbb03ff4b1357a5e9c2ca57548ff975fcbe75fd7c8106d2bed3b1226e45c"} Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.799130 4755 scope.go:117] "RemoveContainer" containerID="9a582759371a9ac69d3e8fb77e85d135abaab22ab0ec19e4764bf1ee9f607e2a" Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.799132 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-4qjn2" Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.846686 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-4qjn2"] Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.859159 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-4qjn2"] Oct 06 08:39:40 crc kubenswrapper[4755]: I1006 08:39:40.875722 4755 scope.go:117] "RemoveContainer" containerID="fce724e0a30e9c89d584227dcc8aa4096438bbeb0999bdf17e4abf4093ce120f" Oct 06 08:39:41 crc kubenswrapper[4755]: I1006 08:39:41.890958 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46024b0b-7959-469e-be3a-72570d94b1b2" path="/var/lib/kubelet/pods/46024b0b-7959-469e-be3a-72570d94b1b2/volumes" Oct 06 08:39:42 crc kubenswrapper[4755]: I1006 08:39:42.828968 4755 generic.go:334] "Generic (PLEG): container finished" podID="a1d3acf3-366b-4613-a557-e698a633c5a2" containerID="7ba6e4a7c57d37919287a64b255c83b8470e2ad36899e5645b1d1d0386ef9ff5" exitCode=0 Oct 06 08:39:42 crc kubenswrapper[4755]: I1006 08:39:42.829010 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a1d3acf3-366b-4613-a557-e698a633c5a2","Type":"ContainerDied","Data":"7ba6e4a7c57d37919287a64b255c83b8470e2ad36899e5645b1d1d0386ef9ff5"} Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 
08:39:43.304645 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.411353 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-combined-ca-bundle\") pod \"a1d3acf3-366b-4613-a557-e698a633c5a2\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.411413 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-config-data\") pod \"a1d3acf3-366b-4613-a557-e698a633c5a2\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.412206 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1d3acf3-366b-4613-a557-e698a633c5a2-etc-machine-id\") pod \"a1d3acf3-366b-4613-a557-e698a633c5a2\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.412254 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-scripts\") pod \"a1d3acf3-366b-4613-a557-e698a633c5a2\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.412294 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7jsb\" (UniqueName: \"kubernetes.io/projected/a1d3acf3-366b-4613-a557-e698a633c5a2-kube-api-access-p7jsb\") pod \"a1d3acf3-366b-4613-a557-e698a633c5a2\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.412431 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-config-data-custom\") pod \"a1d3acf3-366b-4613-a557-e698a633c5a2\" (UID: \"a1d3acf3-366b-4613-a557-e698a633c5a2\") " Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.412295 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1d3acf3-366b-4613-a557-e698a633c5a2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a1d3acf3-366b-4613-a557-e698a633c5a2" (UID: "a1d3acf3-366b-4613-a557-e698a633c5a2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.418640 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a1d3acf3-366b-4613-a557-e698a633c5a2" (UID: "a1d3acf3-366b-4613-a557-e698a633c5a2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.427765 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d3acf3-366b-4613-a557-e698a633c5a2-kube-api-access-p7jsb" (OuterVolumeSpecName: "kube-api-access-p7jsb") pod "a1d3acf3-366b-4613-a557-e698a633c5a2" (UID: "a1d3acf3-366b-4613-a557-e698a633c5a2"). InnerVolumeSpecName "kube-api-access-p7jsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.428692 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-scripts" (OuterVolumeSpecName: "scripts") pod "a1d3acf3-366b-4613-a557-e698a633c5a2" (UID: "a1d3acf3-366b-4613-a557-e698a633c5a2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.484500 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1d3acf3-366b-4613-a557-e698a633c5a2" (UID: "a1d3acf3-366b-4613-a557-e698a633c5a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.515763 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.515797 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1d3acf3-366b-4613-a557-e698a633c5a2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.515810 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.515823 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7jsb\" (UniqueName: \"kubernetes.io/projected/a1d3acf3-366b-4613-a557-e698a633c5a2-kube-api-access-p7jsb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.515838 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.525646 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-config-data" (OuterVolumeSpecName: "config-data") pod "a1d3acf3-366b-4613-a557-e698a633c5a2" (UID: "a1d3acf3-366b-4613-a557-e698a633c5a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.626181 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1d3acf3-366b-4613-a557-e698a633c5a2-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.867452 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a1d3acf3-366b-4613-a557-e698a633c5a2","Type":"ContainerDied","Data":"0942bce3708e3c25de48d47456668afb85e43b163aeeec48a5cdc35ebd198781"} Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.867517 4755 scope.go:117] "RemoveContainer" containerID="a3780e0e91031fad5949c60e2638fef313557c4bc63c2c053c52ce72f138d4fe" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.867714 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.939709 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.961369 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.975334 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:39:43 crc kubenswrapper[4755]: E1006 08:39:43.975815 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46024b0b-7959-469e-be3a-72570d94b1b2" containerName="init" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.975838 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="46024b0b-7959-469e-be3a-72570d94b1b2" containerName="init" Oct 06 08:39:43 crc kubenswrapper[4755]: E1006 08:39:43.975863 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d3acf3-366b-4613-a557-e698a633c5a2" containerName="probe" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.975871 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d3acf3-366b-4613-a557-e698a633c5a2" containerName="probe" Oct 06 08:39:43 crc kubenswrapper[4755]: E1006 08:39:43.975903 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d3acf3-366b-4613-a557-e698a633c5a2" containerName="cinder-scheduler" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.975911 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d3acf3-366b-4613-a557-e698a633c5a2" containerName="cinder-scheduler" Oct 06 08:39:43 crc kubenswrapper[4755]: E1006 08:39:43.975919 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46024b0b-7959-469e-be3a-72570d94b1b2" containerName="dnsmasq-dns" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.975925 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="46024b0b-7959-469e-be3a-72570d94b1b2" containerName="dnsmasq-dns" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.976118 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="46024b0b-7959-469e-be3a-72570d94b1b2" containerName="dnsmasq-dns" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.976147 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1d3acf3-366b-4613-a557-e698a633c5a2" containerName="probe" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.976164 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1d3acf3-366b-4613-a557-e698a633c5a2" containerName="cinder-scheduler" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.977274 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.981272 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 08:39:43 crc kubenswrapper[4755]: I1006 08:39:43.992738 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.041721 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.041834 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-config-data\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 
08:39:44.041896 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.041923 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rr9p\" (UniqueName: \"kubernetes.io/projected/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-kube-api-access-2rr9p\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.042015 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.042052 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-scripts\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.143877 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.143929 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-config-data\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.143980 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rr9p\" (UniqueName: \"kubernetes.io/projected/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-kube-api-access-2rr9p\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.144001 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.144019 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.144115 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.144181 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-scripts\") 
pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.148407 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.148717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-config-data\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.157304 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.161302 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rr9p\" (UniqueName: \"kubernetes.io/projected/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-kube-api-access-2rr9p\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.164772 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb717d16-7646-4b16-b1fb-fb2f580ceaf9-scripts\") pod \"cinder-scheduler-0\" (UID: \"fb717d16-7646-4b16-b1fb-fb2f580ceaf9\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.225372 4755 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-t749f"] Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.226495 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t749f" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.236064 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-t749f"] Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.245676 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdzgr\" (UniqueName: \"kubernetes.io/projected/22360df0-a5e3-45dd-95b7-ddec07373964-kube-api-access-mdzgr\") pod \"nova-api-db-create-t749f\" (UID: \"22360df0-a5e3-45dd-95b7-ddec07373964\") " pod="openstack/nova-api-db-create-t749f" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.304246 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.314481 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9hrkm"] Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.315684 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9hrkm" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.338377 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9hrkm"] Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.347774 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdzgr\" (UniqueName: \"kubernetes.io/projected/22360df0-a5e3-45dd-95b7-ddec07373964-kube-api-access-mdzgr\") pod \"nova-api-db-create-t749f\" (UID: \"22360df0-a5e3-45dd-95b7-ddec07373964\") " pod="openstack/nova-api-db-create-t749f" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.347933 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzgw6\" (UniqueName: \"kubernetes.io/projected/092ec804-1b49-4994-94f3-2051535bb3bf-kube-api-access-lzgw6\") pod \"nova-cell0-db-create-9hrkm\" (UID: \"092ec804-1b49-4994-94f3-2051535bb3bf\") " pod="openstack/nova-cell0-db-create-9hrkm" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.363761 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdzgr\" (UniqueName: \"kubernetes.io/projected/22360df0-a5e3-45dd-95b7-ddec07373964-kube-api-access-mdzgr\") pod \"nova-api-db-create-t749f\" (UID: \"22360df0-a5e3-45dd-95b7-ddec07373964\") " pod="openstack/nova-api-db-create-t749f" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.423052 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-p85gv"] Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.424076 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-p85gv" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.451543 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzgw6\" (UniqueName: \"kubernetes.io/projected/092ec804-1b49-4994-94f3-2051535bb3bf-kube-api-access-lzgw6\") pod \"nova-cell0-db-create-9hrkm\" (UID: \"092ec804-1b49-4994-94f3-2051535bb3bf\") " pod="openstack/nova-cell0-db-create-9hrkm" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.451727 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7fnk\" (UniqueName: \"kubernetes.io/projected/20cf827c-cb1d-42b1-a5e0-63854c591bdf-kube-api-access-s7fnk\") pod \"nova-cell1-db-create-p85gv\" (UID: \"20cf827c-cb1d-42b1-a5e0-63854c591bdf\") " pod="openstack/nova-cell1-db-create-p85gv" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.457724 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p85gv"] Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.475741 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzgw6\" (UniqueName: \"kubernetes.io/projected/092ec804-1b49-4994-94f3-2051535bb3bf-kube-api-access-lzgw6\") pod \"nova-cell0-db-create-9hrkm\" (UID: \"092ec804-1b49-4994-94f3-2051535bb3bf\") " pod="openstack/nova-cell0-db-create-9hrkm" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.549110 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-t749f" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.552807 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7fnk\" (UniqueName: \"kubernetes.io/projected/20cf827c-cb1d-42b1-a5e0-63854c591bdf-kube-api-access-s7fnk\") pod \"nova-cell1-db-create-p85gv\" (UID: \"20cf827c-cb1d-42b1-a5e0-63854c591bdf\") " pod="openstack/nova-cell1-db-create-p85gv" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.569059 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7fnk\" (UniqueName: \"kubernetes.io/projected/20cf827c-cb1d-42b1-a5e0-63854c591bdf-kube-api-access-s7fnk\") pod \"nova-cell1-db-create-p85gv\" (UID: \"20cf827c-cb1d-42b1-a5e0-63854c591bdf\") " pod="openstack/nova-cell1-db-create-p85gv" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.630659 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9hrkm" Oct 06 08:39:44 crc kubenswrapper[4755]: I1006 08:39:44.760670 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-p85gv" Oct 06 08:39:45 crc kubenswrapper[4755]: I1006 08:39:45.893533 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1d3acf3-366b-4613-a557-e698a633c5a2" path="/var/lib/kubelet/pods/a1d3acf3-366b-4613-a557-e698a633c5a2/volumes" Oct 06 08:39:46 crc kubenswrapper[4755]: I1006 08:39:46.118413 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:46 crc kubenswrapper[4755]: I1006 08:39:46.118688 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="ceilometer-central-agent" containerID="cri-o://fc2e6db0b882631f699b716510c49aaf912424f1fd6b48f48d7e016bafc0dc49" gracePeriod=30 Oct 06 08:39:46 crc kubenswrapper[4755]: I1006 08:39:46.118735 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="sg-core" containerID="cri-o://70ec16889805bc93454292502c46a412ccadc7f10c1bd7d5b847d89b2fdbe2d7" gracePeriod=30 Oct 06 08:39:46 crc kubenswrapper[4755]: I1006 08:39:46.118810 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="proxy-httpd" containerID="cri-o://8db20209351aed54da6776857bb55406d69418eb44a16142e5d48fd69a8a29a4" gracePeriod=30 Oct 06 08:39:46 crc kubenswrapper[4755]: I1006 08:39:46.118812 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="ceilometer-notification-agent" containerID="cri-o://4c9250252c256b9d9196af452474e6d6e7ba49aebeb0278335a701581e21b235" gracePeriod=30 Oct 06 08:39:46 crc kubenswrapper[4755]: I1006 08:39:46.127921 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ceilometer-0" Oct 06 08:39:46 crc kubenswrapper[4755]: I1006 08:39:46.895196 4755 generic.go:334] "Generic (PLEG): container finished" podID="cde5598c-3b31-4691-b149-7602575c7ff4" containerID="8db20209351aed54da6776857bb55406d69418eb44a16142e5d48fd69a8a29a4" exitCode=0 Oct 06 08:39:46 crc kubenswrapper[4755]: I1006 08:39:46.895231 4755 generic.go:334] "Generic (PLEG): container finished" podID="cde5598c-3b31-4691-b149-7602575c7ff4" containerID="70ec16889805bc93454292502c46a412ccadc7f10c1bd7d5b847d89b2fdbe2d7" exitCode=2 Oct 06 08:39:46 crc kubenswrapper[4755]: I1006 08:39:46.895245 4755 generic.go:334] "Generic (PLEG): container finished" podID="cde5598c-3b31-4691-b149-7602575c7ff4" containerID="fc2e6db0b882631f699b716510c49aaf912424f1fd6b48f48d7e016bafc0dc49" exitCode=0 Oct 06 08:39:46 crc kubenswrapper[4755]: I1006 08:39:46.895270 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cde5598c-3b31-4691-b149-7602575c7ff4","Type":"ContainerDied","Data":"8db20209351aed54da6776857bb55406d69418eb44a16142e5d48fd69a8a29a4"} Oct 06 08:39:46 crc kubenswrapper[4755]: I1006 08:39:46.895313 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cde5598c-3b31-4691-b149-7602575c7ff4","Type":"ContainerDied","Data":"70ec16889805bc93454292502c46a412ccadc7f10c1bd7d5b847d89b2fdbe2d7"} Oct 06 08:39:46 crc kubenswrapper[4755]: I1006 08:39:46.895325 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cde5598c-3b31-4691-b149-7602575c7ff4","Type":"ContainerDied","Data":"fc2e6db0b882631f699b716510c49aaf912424f1fd6b48f48d7e016bafc0dc49"} Oct 06 08:39:47 crc kubenswrapper[4755]: I1006 08:39:47.148508 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 06 08:39:48 crc kubenswrapper[4755]: I1006 08:39:48.912288 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:39:48 crc kubenswrapper[4755]: I1006 08:39:48.912694 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:39:48 crc kubenswrapper[4755]: I1006 08:39:48.912745 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:39:48 crc kubenswrapper[4755]: I1006 08:39:48.913415 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81b36d63c3c7ca9fbafe357e61481e8979d6babd72103e4b42d972dd0f76d2d5"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:39:48 crc kubenswrapper[4755]: I1006 08:39:48.913470 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://81b36d63c3c7ca9fbafe357e61481e8979d6babd72103e4b42d972dd0f76d2d5" gracePeriod=600 Oct 06 08:39:49 crc kubenswrapper[4755]: I1006 08:39:49.467173 4755 scope.go:117] "RemoveContainer" containerID="7ba6e4a7c57d37919287a64b255c83b8470e2ad36899e5645b1d1d0386ef9ff5" Oct 06 08:39:49 crc kubenswrapper[4755]: I1006 08:39:49.921678 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" 
containerID="81b36d63c3c7ca9fbafe357e61481e8979d6babd72103e4b42d972dd0f76d2d5" exitCode=0 Oct 06 08:39:49 crc kubenswrapper[4755]: I1006 08:39:49.922077 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"81b36d63c3c7ca9fbafe357e61481e8979d6babd72103e4b42d972dd0f76d2d5"} Oct 06 08:39:49 crc kubenswrapper[4755]: I1006 08:39:49.922441 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"c6f0481014a3fc8cdc1fdc7ef5ec1603dfb57fa2e7007554d45ab50020ac3f64"} Oct 06 08:39:49 crc kubenswrapper[4755]: I1006 08:39:49.922472 4755 scope.go:117] "RemoveContainer" containerID="d429b678b36d347ceb6d82738a5216f8e1c07a0afd1e703d9e929f6a065850ec" Oct 06 08:39:49 crc kubenswrapper[4755]: I1006 08:39:49.925310 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9805ff75-3e68-41eb-a711-ecc8e70ee16a","Type":"ContainerStarted","Data":"405eef49ad5c653da7bf77239cac4183ae648a3582ac1abac03a917f2a7633ca"} Oct 06 08:39:49 crc kubenswrapper[4755]: I1006 08:39:49.954481 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.8012013740000001 podStartE2EDuration="11.954447333s" podCreationTimestamp="2025-10-06 08:39:38 +0000 UTC" firstStartedPulling="2025-10-06 08:39:39.437778912 +0000 UTC m=+1036.267094126" lastFinishedPulling="2025-10-06 08:39:49.591024871 +0000 UTC m=+1046.420340085" observedRunningTime="2025-10-06 08:39:49.951628202 +0000 UTC m=+1046.780943426" watchObservedRunningTime="2025-10-06 08:39:49.954447333 +0000 UTC m=+1046.783762547" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.007967 4755 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ceilometer-0" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.147:3000/\": dial tcp 10.217.0.147:3000: connect: connection refused" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.018761 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9hrkm"] Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.136502 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-t749f"] Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.145364 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:39:50 crc kubenswrapper[4755]: W1006 08:39:50.148446 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb717d16_7646_4b16_b1fb_fb2f580ceaf9.slice/crio-73cdd1f445ca4dcf3def31e8e8ae993e32b4c7e4d8ec07664efeb1436730cf6c WatchSource:0}: Error finding container 73cdd1f445ca4dcf3def31e8e8ae993e32b4c7e4d8ec07664efeb1436730cf6c: Status 404 returned error can't find the container with id 73cdd1f445ca4dcf3def31e8e8ae993e32b4c7e4d8ec07664efeb1436730cf6c Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.327984 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p85gv"] Oct 06 08:39:50 crc kubenswrapper[4755]: W1006 08:39:50.513820 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20cf827c_cb1d_42b1_a5e0_63854c591bdf.slice/crio-6543c1cb08a37d43b31c2db6e4c0d1ddaf94804ad8712a1b06b5e0b444e8acd9 WatchSource:0}: Error finding container 6543c1cb08a37d43b31c2db6e4c0d1ddaf94804ad8712a1b06b5e0b444e8acd9: Status 404 returned error can't find the container with id 6543c1cb08a37d43b31c2db6e4c0d1ddaf94804ad8712a1b06b5e0b444e8acd9 Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 
08:39:50.695798 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.795879 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-sg-core-conf-yaml\") pod \"cde5598c-3b31-4691-b149-7602575c7ff4\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.796232 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cde5598c-3b31-4691-b149-7602575c7ff4-log-httpd\") pod \"cde5598c-3b31-4691-b149-7602575c7ff4\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.796303 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-combined-ca-bundle\") pod \"cde5598c-3b31-4691-b149-7602575c7ff4\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.796321 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-config-data\") pod \"cde5598c-3b31-4691-b149-7602575c7ff4\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.796338 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-scripts\") pod \"cde5598c-3b31-4691-b149-7602575c7ff4\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.796379 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-gqm9x\" (UniqueName: \"kubernetes.io/projected/cde5598c-3b31-4691-b149-7602575c7ff4-kube-api-access-gqm9x\") pod \"cde5598c-3b31-4691-b149-7602575c7ff4\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.796410 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cde5598c-3b31-4691-b149-7602575c7ff4-run-httpd\") pod \"cde5598c-3b31-4691-b149-7602575c7ff4\" (UID: \"cde5598c-3b31-4691-b149-7602575c7ff4\") " Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.797302 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cde5598c-3b31-4691-b149-7602575c7ff4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cde5598c-3b31-4691-b149-7602575c7ff4" (UID: "cde5598c-3b31-4691-b149-7602575c7ff4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.798467 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cde5598c-3b31-4691-b149-7602575c7ff4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cde5598c-3b31-4691-b149-7602575c7ff4" (UID: "cde5598c-3b31-4691-b149-7602575c7ff4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.803009 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cde5598c-3b31-4691-b149-7602575c7ff4-kube-api-access-gqm9x" (OuterVolumeSpecName: "kube-api-access-gqm9x") pod "cde5598c-3b31-4691-b149-7602575c7ff4" (UID: "cde5598c-3b31-4691-b149-7602575c7ff4"). InnerVolumeSpecName "kube-api-access-gqm9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.822289 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-scripts" (OuterVolumeSpecName: "scripts") pod "cde5598c-3b31-4691-b149-7602575c7ff4" (UID: "cde5598c-3b31-4691-b149-7602575c7ff4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.846831 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cde5598c-3b31-4691-b149-7602575c7ff4" (UID: "cde5598c-3b31-4691-b149-7602575c7ff4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.899803 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cde5598c-3b31-4691-b149-7602575c7ff4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.899838 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.899848 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqm9x\" (UniqueName: \"kubernetes.io/projected/cde5598c-3b31-4691-b149-7602575c7ff4-kube-api-access-gqm9x\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.899857 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cde5598c-3b31-4691-b149-7602575c7ff4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:50 crc 
kubenswrapper[4755]: I1006 08:39:50.899889 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.909740 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cde5598c-3b31-4691-b149-7602575c7ff4" (UID: "cde5598c-3b31-4691-b149-7602575c7ff4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.943413 4755 generic.go:334] "Generic (PLEG): container finished" podID="092ec804-1b49-4994-94f3-2051535bb3bf" containerID="9ad411a47f3bac48efa94042cb58e8483025f5bca058e71c4176b1e09a989674" exitCode=0 Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.943774 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9hrkm" event={"ID":"092ec804-1b49-4994-94f3-2051535bb3bf","Type":"ContainerDied","Data":"9ad411a47f3bac48efa94042cb58e8483025f5bca058e71c4176b1e09a989674"} Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.943799 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9hrkm" event={"ID":"092ec804-1b49-4994-94f3-2051535bb3bf","Type":"ContainerStarted","Data":"60b116e879d04275591aa09ea7bd7ee9d2a2a7f9d973db37027e70d22564a597"} Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.948448 4755 generic.go:334] "Generic (PLEG): container finished" podID="20cf827c-cb1d-42b1-a5e0-63854c591bdf" containerID="b325977df06f1ac602cb71fe4d337d10b6a540178b4cab1a1333239e2acb91be" exitCode=0 Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.948544 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p85gv" 
event={"ID":"20cf827c-cb1d-42b1-a5e0-63854c591bdf","Type":"ContainerDied","Data":"b325977df06f1ac602cb71fe4d337d10b6a540178b4cab1a1333239e2acb91be"} Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.948590 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p85gv" event={"ID":"20cf827c-cb1d-42b1-a5e0-63854c591bdf","Type":"ContainerStarted","Data":"6543c1cb08a37d43b31c2db6e4c0d1ddaf94804ad8712a1b06b5e0b444e8acd9"} Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.962821 4755 generic.go:334] "Generic (PLEG): container finished" podID="cde5598c-3b31-4691-b149-7602575c7ff4" containerID="4c9250252c256b9d9196af452474e6d6e7ba49aebeb0278335a701581e21b235" exitCode=0 Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.962873 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.962891 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cde5598c-3b31-4691-b149-7602575c7ff4","Type":"ContainerDied","Data":"4c9250252c256b9d9196af452474e6d6e7ba49aebeb0278335a701581e21b235"} Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.962915 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cde5598c-3b31-4691-b149-7602575c7ff4","Type":"ContainerDied","Data":"71aa47d747df5be3c76a087610bf4b107811675b078b57a020ce0ed5ef56fd55"} Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.962932 4755 scope.go:117] "RemoveContainer" containerID="8db20209351aed54da6776857bb55406d69418eb44a16142e5d48fd69a8a29a4" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.964762 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-config-data" (OuterVolumeSpecName: "config-data") pod "cde5598c-3b31-4691-b149-7602575c7ff4" (UID: 
"cde5598c-3b31-4691-b149-7602575c7ff4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.982054 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fb717d16-7646-4b16-b1fb-fb2f580ceaf9","Type":"ContainerStarted","Data":"73cdd1f445ca4dcf3def31e8e8ae993e32b4c7e4d8ec07664efeb1436730cf6c"} Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.986221 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t749f" event={"ID":"22360df0-a5e3-45dd-95b7-ddec07373964","Type":"ContainerStarted","Data":"602ebd4a59a29fd43ba48888cb66ba4e3ab2e58b06ce762f1bac8fd5ad26b4a2"} Oct 06 08:39:50 crc kubenswrapper[4755]: I1006 08:39:50.986248 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t749f" event={"ID":"22360df0-a5e3-45dd-95b7-ddec07373964","Type":"ContainerStarted","Data":"25a0a0bae157a8fc39ad55aebb432f068c7ef6c327bff239e9cf750b22ff2afd"} Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.001241 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.001276 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde5598c-3b31-4691-b149-7602575c7ff4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.056887 4755 scope.go:117] "RemoveContainer" containerID="70ec16889805bc93454292502c46a412ccadc7f10c1bd7d5b847d89b2fdbe2d7" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.080458 4755 scope.go:117] "RemoveContainer" containerID="4c9250252c256b9d9196af452474e6d6e7ba49aebeb0278335a701581e21b235" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 
08:39:51.103296 4755 scope.go:117] "RemoveContainer" containerID="fc2e6db0b882631f699b716510c49aaf912424f1fd6b48f48d7e016bafc0dc49" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.125920 4755 scope.go:117] "RemoveContainer" containerID="8db20209351aed54da6776857bb55406d69418eb44a16142e5d48fd69a8a29a4" Oct 06 08:39:51 crc kubenswrapper[4755]: E1006 08:39:51.126690 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8db20209351aed54da6776857bb55406d69418eb44a16142e5d48fd69a8a29a4\": container with ID starting with 8db20209351aed54da6776857bb55406d69418eb44a16142e5d48fd69a8a29a4 not found: ID does not exist" containerID="8db20209351aed54da6776857bb55406d69418eb44a16142e5d48fd69a8a29a4" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.126737 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8db20209351aed54da6776857bb55406d69418eb44a16142e5d48fd69a8a29a4"} err="failed to get container status \"8db20209351aed54da6776857bb55406d69418eb44a16142e5d48fd69a8a29a4\": rpc error: code = NotFound desc = could not find container \"8db20209351aed54da6776857bb55406d69418eb44a16142e5d48fd69a8a29a4\": container with ID starting with 8db20209351aed54da6776857bb55406d69418eb44a16142e5d48fd69a8a29a4 not found: ID does not exist" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.126765 4755 scope.go:117] "RemoveContainer" containerID="70ec16889805bc93454292502c46a412ccadc7f10c1bd7d5b847d89b2fdbe2d7" Oct 06 08:39:51 crc kubenswrapper[4755]: E1006 08:39:51.127317 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ec16889805bc93454292502c46a412ccadc7f10c1bd7d5b847d89b2fdbe2d7\": container with ID starting with 70ec16889805bc93454292502c46a412ccadc7f10c1bd7d5b847d89b2fdbe2d7 not found: ID does not exist" 
containerID="70ec16889805bc93454292502c46a412ccadc7f10c1bd7d5b847d89b2fdbe2d7" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.127362 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ec16889805bc93454292502c46a412ccadc7f10c1bd7d5b847d89b2fdbe2d7"} err="failed to get container status \"70ec16889805bc93454292502c46a412ccadc7f10c1bd7d5b847d89b2fdbe2d7\": rpc error: code = NotFound desc = could not find container \"70ec16889805bc93454292502c46a412ccadc7f10c1bd7d5b847d89b2fdbe2d7\": container with ID starting with 70ec16889805bc93454292502c46a412ccadc7f10c1bd7d5b847d89b2fdbe2d7 not found: ID does not exist" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.127393 4755 scope.go:117] "RemoveContainer" containerID="4c9250252c256b9d9196af452474e6d6e7ba49aebeb0278335a701581e21b235" Oct 06 08:39:51 crc kubenswrapper[4755]: E1006 08:39:51.127765 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9250252c256b9d9196af452474e6d6e7ba49aebeb0278335a701581e21b235\": container with ID starting with 4c9250252c256b9d9196af452474e6d6e7ba49aebeb0278335a701581e21b235 not found: ID does not exist" containerID="4c9250252c256b9d9196af452474e6d6e7ba49aebeb0278335a701581e21b235" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.127803 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c9250252c256b9d9196af452474e6d6e7ba49aebeb0278335a701581e21b235"} err="failed to get container status \"4c9250252c256b9d9196af452474e6d6e7ba49aebeb0278335a701581e21b235\": rpc error: code = NotFound desc = could not find container \"4c9250252c256b9d9196af452474e6d6e7ba49aebeb0278335a701581e21b235\": container with ID starting with 4c9250252c256b9d9196af452474e6d6e7ba49aebeb0278335a701581e21b235 not found: ID does not exist" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.127828 4755 scope.go:117] 
"RemoveContainer" containerID="fc2e6db0b882631f699b716510c49aaf912424f1fd6b48f48d7e016bafc0dc49" Oct 06 08:39:51 crc kubenswrapper[4755]: E1006 08:39:51.128113 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc2e6db0b882631f699b716510c49aaf912424f1fd6b48f48d7e016bafc0dc49\": container with ID starting with fc2e6db0b882631f699b716510c49aaf912424f1fd6b48f48d7e016bafc0dc49 not found: ID does not exist" containerID="fc2e6db0b882631f699b716510c49aaf912424f1fd6b48f48d7e016bafc0dc49" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.128140 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc2e6db0b882631f699b716510c49aaf912424f1fd6b48f48d7e016bafc0dc49"} err="failed to get container status \"fc2e6db0b882631f699b716510c49aaf912424f1fd6b48f48d7e016bafc0dc49\": rpc error: code = NotFound desc = could not find container \"fc2e6db0b882631f699b716510c49aaf912424f1fd6b48f48d7e016bafc0dc49\": container with ID starting with fc2e6db0b882631f699b716510c49aaf912424f1fd6b48f48d7e016bafc0dc49 not found: ID does not exist" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.301792 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.325131 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.331783 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:51 crc kubenswrapper[4755]: E1006 08:39:51.332146 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="sg-core" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.332240 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="sg-core" Oct 06 08:39:51 crc 
kubenswrapper[4755]: E1006 08:39:51.332260 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="ceilometer-notification-agent" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.332266 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="ceilometer-notification-agent" Oct 06 08:39:51 crc kubenswrapper[4755]: E1006 08:39:51.332274 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="proxy-httpd" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.332280 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="proxy-httpd" Oct 06 08:39:51 crc kubenswrapper[4755]: E1006 08:39:51.332311 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="ceilometer-central-agent" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.332318 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="ceilometer-central-agent" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.332475 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="ceilometer-central-agent" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.332485 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="ceilometer-notification-agent" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.332499 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="proxy-httpd" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.332510 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cde5598c-3b31-4691-b149-7602575c7ff4" containerName="sg-core" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.334014 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.338988 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.339451 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.344436 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.408289 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-config-data\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.408347 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7800ee15-58ef-415b-b34e-ab357d3d0111-log-httpd\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.408415 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.408441 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-scripts\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.408472 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.408497 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq7pq\" (UniqueName: \"kubernetes.io/projected/7800ee15-58ef-415b-b34e-ab357d3d0111-kube-api-access-qq7pq\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.408520 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7800ee15-58ef-415b-b34e-ab357d3d0111-run-httpd\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.510533 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.510621 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-scripts\") pod \"ceilometer-0\" (UID: 
\"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.510671 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.510702 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq7pq\" (UniqueName: \"kubernetes.io/projected/7800ee15-58ef-415b-b34e-ab357d3d0111-kube-api-access-qq7pq\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.510734 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7800ee15-58ef-415b-b34e-ab357d3d0111-run-httpd\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.510785 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-config-data\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.510825 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7800ee15-58ef-415b-b34e-ab357d3d0111-log-httpd\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.511781 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/7800ee15-58ef-415b-b34e-ab357d3d0111-log-httpd\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.511841 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7800ee15-58ef-415b-b34e-ab357d3d0111-run-httpd\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.517365 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-scripts\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.527647 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.534448 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-config-data\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.534822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.546580 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq7pq\" (UniqueName: \"kubernetes.io/projected/7800ee15-58ef-415b-b34e-ab357d3d0111-kube-api-access-qq7pq\") pod \"ceilometer-0\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.656547 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:51 crc kubenswrapper[4755]: I1006 08:39:51.895753 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cde5598c-3b31-4691-b149-7602575c7ff4" path="/var/lib/kubelet/pods/cde5598c-3b31-4691-b149-7602575c7ff4/volumes" Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.003387 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fb717d16-7646-4b16-b1fb-fb2f580ceaf9","Type":"ContainerStarted","Data":"7b71ce233c9922808dc681a988190c51bc1890213a2470e8f82b26097d70a912"} Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.003461 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fb717d16-7646-4b16-b1fb-fb2f580ceaf9","Type":"ContainerStarted","Data":"332b9068f4ec86e8c3bebb979b28248ffeab07493eee40891e7d68015b28d21f"} Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.006320 4755 generic.go:334] "Generic (PLEG): container finished" podID="22360df0-a5e3-45dd-95b7-ddec07373964" containerID="602ebd4a59a29fd43ba48888cb66ba4e3ab2e58b06ce762f1bac8fd5ad26b4a2" exitCode=0 Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.006442 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t749f" event={"ID":"22360df0-a5e3-45dd-95b7-ddec07373964","Type":"ContainerDied","Data":"602ebd4a59a29fd43ba48888cb66ba4e3ab2e58b06ce762f1bac8fd5ad26b4a2"} Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.038020 4755 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=9.038003714 podStartE2EDuration="9.038003714s" podCreationTimestamp="2025-10-06 08:39:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:52.037879561 +0000 UTC m=+1048.867194765" watchObservedRunningTime="2025-10-06 08:39:52.038003714 +0000 UTC m=+1048.867318928" Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.219048 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.558808 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t749f" Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.644809 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdzgr\" (UniqueName: \"kubernetes.io/projected/22360df0-a5e3-45dd-95b7-ddec07373964-kube-api-access-mdzgr\") pod \"22360df0-a5e3-45dd-95b7-ddec07373964\" (UID: \"22360df0-a5e3-45dd-95b7-ddec07373964\") " Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.651836 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22360df0-a5e3-45dd-95b7-ddec07373964-kube-api-access-mdzgr" (OuterVolumeSpecName: "kube-api-access-mdzgr") pod "22360df0-a5e3-45dd-95b7-ddec07373964" (UID: "22360df0-a5e3-45dd-95b7-ddec07373964"). InnerVolumeSpecName "kube-api-access-mdzgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.673967 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9hrkm" Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.684233 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-p85gv" Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.746883 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7fnk\" (UniqueName: \"kubernetes.io/projected/20cf827c-cb1d-42b1-a5e0-63854c591bdf-kube-api-access-s7fnk\") pod \"20cf827c-cb1d-42b1-a5e0-63854c591bdf\" (UID: \"20cf827c-cb1d-42b1-a5e0-63854c591bdf\") " Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.747194 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzgw6\" (UniqueName: \"kubernetes.io/projected/092ec804-1b49-4994-94f3-2051535bb3bf-kube-api-access-lzgw6\") pod \"092ec804-1b49-4994-94f3-2051535bb3bf\" (UID: \"092ec804-1b49-4994-94f3-2051535bb3bf\") " Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.747591 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdzgr\" (UniqueName: \"kubernetes.io/projected/22360df0-a5e3-45dd-95b7-ddec07373964-kube-api-access-mdzgr\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.751058 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092ec804-1b49-4994-94f3-2051535bb3bf-kube-api-access-lzgw6" (OuterVolumeSpecName: "kube-api-access-lzgw6") pod "092ec804-1b49-4994-94f3-2051535bb3bf" (UID: "092ec804-1b49-4994-94f3-2051535bb3bf"). InnerVolumeSpecName "kube-api-access-lzgw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.751865 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20cf827c-cb1d-42b1-a5e0-63854c591bdf-kube-api-access-s7fnk" (OuterVolumeSpecName: "kube-api-access-s7fnk") pod "20cf827c-cb1d-42b1-a5e0-63854c591bdf" (UID: "20cf827c-cb1d-42b1-a5e0-63854c591bdf"). InnerVolumeSpecName "kube-api-access-s7fnk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.849725 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzgw6\" (UniqueName: \"kubernetes.io/projected/092ec804-1b49-4994-94f3-2051535bb3bf-kube-api-access-lzgw6\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:52 crc kubenswrapper[4755]: I1006 08:39:52.849759 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7fnk\" (UniqueName: \"kubernetes.io/projected/20cf827c-cb1d-42b1-a5e0-63854c591bdf-kube-api-access-s7fnk\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:53 crc kubenswrapper[4755]: I1006 08:39:53.018556 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7800ee15-58ef-415b-b34e-ab357d3d0111","Type":"ContainerStarted","Data":"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c"} Oct 06 08:39:53 crc kubenswrapper[4755]: I1006 08:39:53.018993 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7800ee15-58ef-415b-b34e-ab357d3d0111","Type":"ContainerStarted","Data":"a719856406af91cacd5c3457b5a53fd5723205a6350484432b0b921f96ee47e2"} Oct 06 08:39:53 crc kubenswrapper[4755]: I1006 08:39:53.021900 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p85gv" event={"ID":"20cf827c-cb1d-42b1-a5e0-63854c591bdf","Type":"ContainerDied","Data":"6543c1cb08a37d43b31c2db6e4c0d1ddaf94804ad8712a1b06b5e0b444e8acd9"} Oct 06 08:39:53 crc kubenswrapper[4755]: I1006 08:39:53.021937 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-p85gv" Oct 06 08:39:53 crc kubenswrapper[4755]: I1006 08:39:53.021942 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6543c1cb08a37d43b31c2db6e4c0d1ddaf94804ad8712a1b06b5e0b444e8acd9" Oct 06 08:39:53 crc kubenswrapper[4755]: I1006 08:39:53.025730 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t749f" Oct 06 08:39:53 crc kubenswrapper[4755]: I1006 08:39:53.025738 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t749f" event={"ID":"22360df0-a5e3-45dd-95b7-ddec07373964","Type":"ContainerDied","Data":"25a0a0bae157a8fc39ad55aebb432f068c7ef6c327bff239e9cf750b22ff2afd"} Oct 06 08:39:53 crc kubenswrapper[4755]: I1006 08:39:53.025782 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25a0a0bae157a8fc39ad55aebb432f068c7ef6c327bff239e9cf750b22ff2afd" Oct 06 08:39:53 crc kubenswrapper[4755]: I1006 08:39:53.034527 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9hrkm" event={"ID":"092ec804-1b49-4994-94f3-2051535bb3bf","Type":"ContainerDied","Data":"60b116e879d04275591aa09ea7bd7ee9d2a2a7f9d973db37027e70d22564a597"} Oct 06 08:39:53 crc kubenswrapper[4755]: I1006 08:39:53.034591 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60b116e879d04275591aa09ea7bd7ee9d2a2a7f9d973db37027e70d22564a597" Oct 06 08:39:53 crc kubenswrapper[4755]: I1006 08:39:53.034585 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9hrkm" Oct 06 08:39:53 crc kubenswrapper[4755]: I1006 08:39:53.317691 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:54 crc kubenswrapper[4755]: I1006 08:39:54.043915 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7800ee15-58ef-415b-b34e-ab357d3d0111","Type":"ContainerStarted","Data":"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c"} Oct 06 08:39:54 crc kubenswrapper[4755]: I1006 08:39:54.304720 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 06 08:39:55 crc kubenswrapper[4755]: I1006 08:39:55.053091 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7800ee15-58ef-415b-b34e-ab357d3d0111","Type":"ContainerStarted","Data":"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1"} Oct 06 08:39:56 crc kubenswrapper[4755]: I1006 08:39:56.063584 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7800ee15-58ef-415b-b34e-ab357d3d0111","Type":"ContainerStarted","Data":"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396"} Oct 06 08:39:56 crc kubenswrapper[4755]: I1006 08:39:56.063958 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4755]: I1006 08:39:56.063808 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="proxy-httpd" containerID="cri-o://ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396" gracePeriod=30 Oct 06 08:39:56 crc kubenswrapper[4755]: I1006 08:39:56.063808 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="sg-core" containerID="cri-o://4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1" gracePeriod=30 Oct 06 08:39:56 crc kubenswrapper[4755]: I1006 08:39:56.063857 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="ceilometer-notification-agent" containerID="cri-o://81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c" gracePeriod=30 Oct 06 08:39:56 crc kubenswrapper[4755]: I1006 08:39:56.063749 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="ceilometer-central-agent" containerID="cri-o://700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c" gracePeriod=30 Oct 06 08:39:56 crc kubenswrapper[4755]: I1006 08:39:56.096049 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7830723210000001 podStartE2EDuration="5.096030659s" podCreationTimestamp="2025-10-06 08:39:51 +0000 UTC" firstStartedPulling="2025-10-06 08:39:52.231726235 +0000 UTC m=+1049.061041449" lastFinishedPulling="2025-10-06 08:39:55.544684573 +0000 UTC m=+1052.373999787" observedRunningTime="2025-10-06 08:39:56.092899319 +0000 UTC m=+1052.922214533" watchObservedRunningTime="2025-10-06 08:39:56.096030659 +0000 UTC m=+1052.925345873" Oct 06 08:39:56 crc kubenswrapper[4755]: I1006 08:39:56.838417 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.016694 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq7pq\" (UniqueName: \"kubernetes.io/projected/7800ee15-58ef-415b-b34e-ab357d3d0111-kube-api-access-qq7pq\") pod \"7800ee15-58ef-415b-b34e-ab357d3d0111\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.017069 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7800ee15-58ef-415b-b34e-ab357d3d0111-run-httpd\") pod \"7800ee15-58ef-415b-b34e-ab357d3d0111\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.017101 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-scripts\") pod \"7800ee15-58ef-415b-b34e-ab357d3d0111\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.017192 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-sg-core-conf-yaml\") pod \"7800ee15-58ef-415b-b34e-ab357d3d0111\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.017244 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-combined-ca-bundle\") pod \"7800ee15-58ef-415b-b34e-ab357d3d0111\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.017302 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-config-data\") pod \"7800ee15-58ef-415b-b34e-ab357d3d0111\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.018193 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7800ee15-58ef-415b-b34e-ab357d3d0111-log-httpd\") pod \"7800ee15-58ef-415b-b34e-ab357d3d0111\" (UID: \"7800ee15-58ef-415b-b34e-ab357d3d0111\") " Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.018775 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7800ee15-58ef-415b-b34e-ab357d3d0111-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7800ee15-58ef-415b-b34e-ab357d3d0111" (UID: "7800ee15-58ef-415b-b34e-ab357d3d0111"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.018897 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7800ee15-58ef-415b-b34e-ab357d3d0111-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7800ee15-58ef-415b-b34e-ab357d3d0111" (UID: "7800ee15-58ef-415b-b34e-ab357d3d0111"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.028834 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-scripts" (OuterVolumeSpecName: "scripts") pod "7800ee15-58ef-415b-b34e-ab357d3d0111" (UID: "7800ee15-58ef-415b-b34e-ab357d3d0111"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.028957 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7800ee15-58ef-415b-b34e-ab357d3d0111-kube-api-access-qq7pq" (OuterVolumeSpecName: "kube-api-access-qq7pq") pod "7800ee15-58ef-415b-b34e-ab357d3d0111" (UID: "7800ee15-58ef-415b-b34e-ab357d3d0111"). InnerVolumeSpecName "kube-api-access-qq7pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.053826 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7800ee15-58ef-415b-b34e-ab357d3d0111" (UID: "7800ee15-58ef-415b-b34e-ab357d3d0111"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.084743 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.084774 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7800ee15-58ef-415b-b34e-ab357d3d0111","Type":"ContainerDied","Data":"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396"} Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.084843 4755 scope.go:117] "RemoveContainer" containerID="ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.084861 4755 generic.go:334] "Generic (PLEG): container finished" podID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerID="ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396" exitCode=0 Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.084885 4755 generic.go:334] "Generic (PLEG): container finished" podID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerID="4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1" exitCode=2 Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.084895 4755 generic.go:334] "Generic (PLEG): container finished" podID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerID="81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c" exitCode=0 Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.084907 4755 generic.go:334] "Generic (PLEG): container finished" podID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerID="700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c" exitCode=0 Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.084965 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7800ee15-58ef-415b-b34e-ab357d3d0111","Type":"ContainerDied","Data":"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1"} Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.085548 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7800ee15-58ef-415b-b34e-ab357d3d0111","Type":"ContainerDied","Data":"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c"} Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.085644 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7800ee15-58ef-415b-b34e-ab357d3d0111","Type":"ContainerDied","Data":"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c"} Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.085659 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7800ee15-58ef-415b-b34e-ab357d3d0111","Type":"ContainerDied","Data":"a719856406af91cacd5c3457b5a53fd5723205a6350484432b0b921f96ee47e2"} Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.111350 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7800ee15-58ef-415b-b34e-ab357d3d0111" (UID: "7800ee15-58ef-415b-b34e-ab357d3d0111"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.120274 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7800ee15-58ef-415b-b34e-ab357d3d0111-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.120305 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.120316 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.120332 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.120343 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7800ee15-58ef-415b-b34e-ab357d3d0111-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.120354 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq7pq\" (UniqueName: \"kubernetes.io/projected/7800ee15-58ef-415b-b34e-ab357d3d0111-kube-api-access-qq7pq\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.135327 4755 scope.go:117] "RemoveContainer" containerID="4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.135332 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-config-data" (OuterVolumeSpecName: "config-data") pod "7800ee15-58ef-415b-b34e-ab357d3d0111" (UID: "7800ee15-58ef-415b-b34e-ab357d3d0111"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.159777 4755 scope.go:117] "RemoveContainer" containerID="81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.178798 4755 scope.go:117] "RemoveContainer" containerID="700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.203546 4755 scope.go:117] "RemoveContainer" containerID="ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396" Oct 06 08:39:57 crc kubenswrapper[4755]: E1006 08:39:57.206408 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396\": container with ID starting with ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396 not found: ID does not exist" containerID="ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.206456 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396"} err="failed to get container status \"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396\": rpc error: code = NotFound desc = could not find container \"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396\": container with ID starting with ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396 not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.206481 4755 scope.go:117] "RemoveContainer" 
containerID="4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1" Oct 06 08:39:57 crc kubenswrapper[4755]: E1006 08:39:57.206849 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1\": container with ID starting with 4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1 not found: ID does not exist" containerID="4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.206896 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1"} err="failed to get container status \"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1\": rpc error: code = NotFound desc = could not find container \"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1\": container with ID starting with 4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1 not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.206923 4755 scope.go:117] "RemoveContainer" containerID="81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c" Oct 06 08:39:57 crc kubenswrapper[4755]: E1006 08:39:57.207182 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c\": container with ID starting with 81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c not found: ID does not exist" containerID="81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.207212 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c"} err="failed to get container status \"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c\": rpc error: code = NotFound desc = could not find container \"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c\": container with ID starting with 81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.207229 4755 scope.go:117] "RemoveContainer" containerID="700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c" Oct 06 08:39:57 crc kubenswrapper[4755]: E1006 08:39:57.207444 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c\": container with ID starting with 700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c not found: ID does not exist" containerID="700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.207467 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c"} err="failed to get container status \"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c\": rpc error: code = NotFound desc = could not find container \"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c\": container with ID starting with 700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.207481 4755 scope.go:117] "RemoveContainer" containerID="ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.207737 4755 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396"} err="failed to get container status \"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396\": rpc error: code = NotFound desc = could not find container \"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396\": container with ID starting with ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396 not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.207763 4755 scope.go:117] "RemoveContainer" containerID="4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.207995 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1"} err="failed to get container status \"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1\": rpc error: code = NotFound desc = could not find container \"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1\": container with ID starting with 4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1 not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.208016 4755 scope.go:117] "RemoveContainer" containerID="81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.208255 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c"} err="failed to get container status \"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c\": rpc error: code = NotFound desc = could not find container \"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c\": container with ID starting with 81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c not 
found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.208281 4755 scope.go:117] "RemoveContainer" containerID="700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.208514 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c"} err="failed to get container status \"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c\": rpc error: code = NotFound desc = could not find container \"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c\": container with ID starting with 700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.208534 4755 scope.go:117] "RemoveContainer" containerID="ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.208819 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396"} err="failed to get container status \"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396\": rpc error: code = NotFound desc = could not find container \"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396\": container with ID starting with ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396 not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.208852 4755 scope.go:117] "RemoveContainer" containerID="4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.209075 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1"} err="failed to get 
container status \"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1\": rpc error: code = NotFound desc = could not find container \"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1\": container with ID starting with 4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1 not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.209095 4755 scope.go:117] "RemoveContainer" containerID="81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.209295 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c"} err="failed to get container status \"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c\": rpc error: code = NotFound desc = could not find container \"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c\": container with ID starting with 81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.209313 4755 scope.go:117] "RemoveContainer" containerID="700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.209605 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c"} err="failed to get container status \"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c\": rpc error: code = NotFound desc = could not find container \"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c\": container with ID starting with 700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.209632 4755 scope.go:117] "RemoveContainer" 
containerID="ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.209904 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396"} err="failed to get container status \"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396\": rpc error: code = NotFound desc = could not find container \"ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396\": container with ID starting with ee775d2448f61968e920f8c93b459919be3c0b954f9a2a52cf4846d356f7c396 not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.209925 4755 scope.go:117] "RemoveContainer" containerID="4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.210173 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1"} err="failed to get container status \"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1\": rpc error: code = NotFound desc = could not find container \"4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1\": container with ID starting with 4ccf2a99b02dd68d041d904bfbe6a1dfa9611f422f137aa10ff97b4d3bd391b1 not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.210192 4755 scope.go:117] "RemoveContainer" containerID="81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.211007 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c"} err="failed to get container status \"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c\": rpc error: code = NotFound desc = could 
not find container \"81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c\": container with ID starting with 81d984ef8697440cf901927a18753e5e5a8125132274b5be5e597efbf80e669c not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.211024 4755 scope.go:117] "RemoveContainer" containerID="700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.211238 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c"} err="failed to get container status \"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c\": rpc error: code = NotFound desc = could not find container \"700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c\": container with ID starting with 700bb4f94e29ae70fa3070d7b831f63a3cebb687da28a98db17450e96723e42c not found: ID does not exist" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.221626 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7800ee15-58ef-415b-b34e-ab357d3d0111-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.427447 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.433214 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.453643 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:57 crc kubenswrapper[4755]: E1006 08:39:57.454220 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="sg-core" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.454293 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="sg-core" Oct 06 08:39:57 crc kubenswrapper[4755]: E1006 08:39:57.454357 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="proxy-httpd" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.454406 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="proxy-httpd" Oct 06 08:39:57 crc kubenswrapper[4755]: E1006 08:39:57.454479 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cf827c-cb1d-42b1-a5e0-63854c591bdf" containerName="mariadb-database-create" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.454532 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cf827c-cb1d-42b1-a5e0-63854c591bdf" containerName="mariadb-database-create" Oct 06 08:39:57 crc kubenswrapper[4755]: E1006 08:39:57.454600 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092ec804-1b49-4994-94f3-2051535bb3bf" containerName="mariadb-database-create" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.454660 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="092ec804-1b49-4994-94f3-2051535bb3bf" containerName="mariadb-database-create" Oct 06 08:39:57 crc kubenswrapper[4755]: E1006 08:39:57.454733 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="ceilometer-notification-agent" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.454799 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="ceilometer-notification-agent" Oct 06 08:39:57 crc kubenswrapper[4755]: E1006 08:39:57.454879 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22360df0-a5e3-45dd-95b7-ddec07373964" containerName="mariadb-database-create" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.454932 4755 
state_mem.go:107] "Deleted CPUSet assignment" podUID="22360df0-a5e3-45dd-95b7-ddec07373964" containerName="mariadb-database-create" Oct 06 08:39:57 crc kubenswrapper[4755]: E1006 08:39:57.455018 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="ceilometer-central-agent" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.455084 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="ceilometer-central-agent" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.455350 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="ceilometer-notification-agent" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.455445 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="proxy-httpd" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.455520 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cf827c-cb1d-42b1-a5e0-63854c591bdf" containerName="mariadb-database-create" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.455605 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="092ec804-1b49-4994-94f3-2051535bb3bf" containerName="mariadb-database-create" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.455675 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="ceilometer-central-agent" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.455777 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" containerName="sg-core" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.455839 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="22360df0-a5e3-45dd-95b7-ddec07373964" 
containerName="mariadb-database-create" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.457492 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.460424 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.462111 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.462341 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.529910 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-config-data\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.529961 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfhrl\" (UniqueName: \"kubernetes.io/projected/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-kube-api-access-tfhrl\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.530036 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-scripts\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.530247 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.530472 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-log-httpd\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.530641 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.530732 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-run-httpd\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.631854 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.631923 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-log-httpd\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " 
pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.631953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.631975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-run-httpd\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.632024 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-config-data\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.632040 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfhrl\" (UniqueName: \"kubernetes.io/projected/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-kube-api-access-tfhrl\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.632626 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-scripts\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.633245 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-log-httpd\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.633320 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-run-httpd\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.635452 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.636378 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.643011 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-scripts\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.643413 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-config-data\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.650601 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tfhrl\" (UniqueName: \"kubernetes.io/projected/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-kube-api-access-tfhrl\") pod \"ceilometer-0\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.783591 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4755]: I1006 08:39:57.898097 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7800ee15-58ef-415b-b34e-ab357d3d0111" path="/var/lib/kubelet/pods/7800ee15-58ef-415b-b34e-ab357d3d0111/volumes" Oct 06 08:39:58 crc kubenswrapper[4755]: W1006 08:39:58.246942 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod559fe2d2_cbf9_43b5_924e_ba7c7d38d03b.slice/crio-fc5377bdbc36325be2910e74908650d4a7a10c16d42b93457c5b76417328ec12 WatchSource:0}: Error finding container fc5377bdbc36325be2910e74908650d4a7a10c16d42b93457c5b76417328ec12: Status 404 returned error can't find the container with id fc5377bdbc36325be2910e74908650d4a7a10c16d42b93457c5b76417328ec12 Oct 06 08:39:58 crc kubenswrapper[4755]: I1006 08:39:58.251094 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:58 crc kubenswrapper[4755]: I1006 08:39:58.360733 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:39:59 crc kubenswrapper[4755]: I1006 08:39:59.105071 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b","Type":"ContainerStarted","Data":"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4"} Oct 06 08:39:59 crc kubenswrapper[4755]: I1006 08:39:59.105543 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b","Type":"ContainerStarted","Data":"fc5377bdbc36325be2910e74908650d4a7a10c16d42b93457c5b76417328ec12"} Oct 06 08:39:59 crc kubenswrapper[4755]: I1006 08:39:59.590905 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 06 08:40:00 crc kubenswrapper[4755]: I1006 08:40:00.123031 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b","Type":"ContainerStarted","Data":"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1"} Oct 06 08:40:01 crc kubenswrapper[4755]: I1006 08:40:01.137551 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b","Type":"ContainerStarted","Data":"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec"} Oct 06 08:40:01 crc kubenswrapper[4755]: I1006 08:40:01.442668 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.147063 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b","Type":"ContainerStarted","Data":"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0"} Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.147343 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="sg-core" containerID="cri-o://efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec" gracePeriod=30 Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.147233 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="ceilometer-central-agent" 
containerID="cri-o://97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4" gracePeriod=30 Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.147371 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="ceilometer-notification-agent" containerID="cri-o://b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1" gracePeriod=30 Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.147283 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="proxy-httpd" containerID="cri-o://f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0" gracePeriod=30 Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.147504 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.175957 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.901195901 podStartE2EDuration="5.175929936s" podCreationTimestamp="2025-10-06 08:39:57 +0000 UTC" firstStartedPulling="2025-10-06 08:39:58.250190718 +0000 UTC m=+1055.079505932" lastFinishedPulling="2025-10-06 08:40:01.524924753 +0000 UTC m=+1058.354239967" observedRunningTime="2025-10-06 08:40:02.171388629 +0000 UTC m=+1059.000703843" watchObservedRunningTime="2025-10-06 08:40:02.175929936 +0000 UTC m=+1059.005245160" Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.895876 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.957034 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-log-httpd\") pod \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.957377 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-sg-core-conf-yaml\") pod \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.957400 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfhrl\" (UniqueName: \"kubernetes.io/projected/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-kube-api-access-tfhrl\") pod \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.957826 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" (UID: "559fe2d2-cbf9-43b5-924e-ba7c7d38d03b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.958236 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-run-httpd\") pod \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.958288 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-config-data\") pod \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.958354 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-combined-ca-bundle\") pod \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.958651 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-scripts\") pod \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\" (UID: \"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b\") " Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.959414 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.960694 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" (UID: 
"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.964232 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-scripts" (OuterVolumeSpecName: "scripts") pod "559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" (UID: "559fe2d2-cbf9-43b5-924e-ba7c7d38d03b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.966663 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-kube-api-access-tfhrl" (OuterVolumeSpecName: "kube-api-access-tfhrl") pod "559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" (UID: "559fe2d2-cbf9-43b5-924e-ba7c7d38d03b"). InnerVolumeSpecName "kube-api-access-tfhrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4755]: I1006 08:40:02.992320 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" (UID: "559fe2d2-cbf9-43b5-924e-ba7c7d38d03b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.037200 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" (UID: "559fe2d2-cbf9-43b5-924e-ba7c7d38d03b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.060959 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.060999 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfhrl\" (UniqueName: \"kubernetes.io/projected/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-kube-api-access-tfhrl\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.061014 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.061023 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.061032 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.084645 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-config-data" (OuterVolumeSpecName: "config-data") pod "559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" (UID: "559fe2d2-cbf9-43b5-924e-ba7c7d38d03b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.157154 4755 generic.go:334] "Generic (PLEG): container finished" podID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerID="f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0" exitCode=0 Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.157192 4755 generic.go:334] "Generic (PLEG): container finished" podID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerID="efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec" exitCode=2 Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.157201 4755 generic.go:334] "Generic (PLEG): container finished" podID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerID="b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1" exitCode=0 Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.157209 4755 generic.go:334] "Generic (PLEG): container finished" podID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerID="97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4" exitCode=0 Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.157205 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b","Type":"ContainerDied","Data":"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0"} Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.157229 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.157262 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b","Type":"ContainerDied","Data":"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec"} Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.157278 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b","Type":"ContainerDied","Data":"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1"} Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.157291 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b","Type":"ContainerDied","Data":"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4"} Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.157304 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"559fe2d2-cbf9-43b5-924e-ba7c7d38d03b","Type":"ContainerDied","Data":"fc5377bdbc36325be2910e74908650d4a7a10c16d42b93457c5b76417328ec12"} Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.157325 4755 scope.go:117] "RemoveContainer" containerID="f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.162551 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.189740 4755 scope.go:117] "RemoveContainer" containerID="efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.206783 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.223670 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.227409 4755 scope.go:117] "RemoveContainer" containerID="b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.233333 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:03 crc kubenswrapper[4755]: E1006 08:40:03.233956 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="ceilometer-central-agent" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.233978 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="ceilometer-central-agent" Oct 06 08:40:03 crc kubenswrapper[4755]: E1006 08:40:03.233999 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="sg-core" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.234723 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="sg-core" Oct 06 08:40:03 crc kubenswrapper[4755]: E1006 08:40:03.234766 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="proxy-httpd" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.234775 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="proxy-httpd" Oct 06 08:40:03 crc kubenswrapper[4755]: E1006 08:40:03.234795 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="ceilometer-notification-agent" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.234802 4755 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="ceilometer-notification-agent" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.235018 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="ceilometer-central-agent" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.235046 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="ceilometer-notification-agent" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.235056 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="sg-core" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.235075 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" containerName="proxy-httpd" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.238136 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.240548 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.242103 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.244620 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.268771 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-scripts\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.268846 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b609c4-d03c-4cdf-941f-99913b969b0f-run-httpd\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.268883 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.268790 4755 scope.go:117] "RemoveContainer" containerID="97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.268915 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.269064 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b609c4-d03c-4cdf-941f-99913b969b0f-log-httpd\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.269149 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-config-data\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.269215 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twlms\" (UniqueName: \"kubernetes.io/projected/35b609c4-d03c-4cdf-941f-99913b969b0f-kube-api-access-twlms\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.292459 4755 scope.go:117] "RemoveContainer" containerID="f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0" Oct 06 08:40:03 crc kubenswrapper[4755]: E1006 08:40:03.293097 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0\": container with ID starting with f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0 not found: ID does not exist" containerID="f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0" Oct 06 
08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.293198 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0"} err="failed to get container status \"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0\": rpc error: code = NotFound desc = could not find container \"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0\": container with ID starting with f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0 not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.293291 4755 scope.go:117] "RemoveContainer" containerID="efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec" Oct 06 08:40:03 crc kubenswrapper[4755]: E1006 08:40:03.293728 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec\": container with ID starting with efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec not found: ID does not exist" containerID="efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.293775 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec"} err="failed to get container status \"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec\": rpc error: code = NotFound desc = could not find container \"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec\": container with ID starting with efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.293808 4755 scope.go:117] "RemoveContainer" 
containerID="b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1" Oct 06 08:40:03 crc kubenswrapper[4755]: E1006 08:40:03.294145 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1\": container with ID starting with b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1 not found: ID does not exist" containerID="b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.294240 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1"} err="failed to get container status \"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1\": rpc error: code = NotFound desc = could not find container \"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1\": container with ID starting with b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1 not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.294323 4755 scope.go:117] "RemoveContainer" containerID="97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4" Oct 06 08:40:03 crc kubenswrapper[4755]: E1006 08:40:03.294612 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4\": container with ID starting with 97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4 not found: ID does not exist" containerID="97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.294727 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4"} err="failed to get container status \"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4\": rpc error: code = NotFound desc = could not find container \"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4\": container with ID starting with 97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4 not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.294805 4755 scope.go:117] "RemoveContainer" containerID="f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.295132 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0"} err="failed to get container status \"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0\": rpc error: code = NotFound desc = could not find container \"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0\": container with ID starting with f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0 not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.295239 4755 scope.go:117] "RemoveContainer" containerID="efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.295516 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec"} err="failed to get container status \"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec\": rpc error: code = NotFound desc = could not find container \"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec\": container with ID starting with efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec not found: ID does not 
exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.295640 4755 scope.go:117] "RemoveContainer" containerID="b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.295939 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1"} err="failed to get container status \"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1\": rpc error: code = NotFound desc = could not find container \"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1\": container with ID starting with b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1 not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.296043 4755 scope.go:117] "RemoveContainer" containerID="97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.296319 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4"} err="failed to get container status \"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4\": rpc error: code = NotFound desc = could not find container \"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4\": container with ID starting with 97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4 not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.296346 4755 scope.go:117] "RemoveContainer" containerID="f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.296631 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0"} err="failed to get container status 
\"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0\": rpc error: code = NotFound desc = could not find container \"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0\": container with ID starting with f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0 not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.296733 4755 scope.go:117] "RemoveContainer" containerID="efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.297078 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec"} err="failed to get container status \"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec\": rpc error: code = NotFound desc = could not find container \"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec\": container with ID starting with efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.297103 4755 scope.go:117] "RemoveContainer" containerID="b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.297365 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1"} err="failed to get container status \"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1\": rpc error: code = NotFound desc = could not find container \"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1\": container with ID starting with b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1 not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.297445 4755 scope.go:117] "RemoveContainer" 
containerID="97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.297746 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4"} err="failed to get container status \"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4\": rpc error: code = NotFound desc = could not find container \"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4\": container with ID starting with 97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4 not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.297764 4755 scope.go:117] "RemoveContainer" containerID="f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.297975 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0"} err="failed to get container status \"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0\": rpc error: code = NotFound desc = could not find container \"f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0\": container with ID starting with f91cf44d45d08ebd001b872839c879db620402d5308145c63db1948e64180ff0 not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.297992 4755 scope.go:117] "RemoveContainer" containerID="efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.298248 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec"} err="failed to get container status \"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec\": rpc error: code = NotFound desc = could 
not find container \"efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec\": container with ID starting with efccecfd72b1c3c7cbfe5daa7091c96ee2e7b7dac707dffcf247c029a5d4cfec not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.298324 4755 scope.go:117] "RemoveContainer" containerID="b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.298575 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1"} err="failed to get container status \"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1\": rpc error: code = NotFound desc = could not find container \"b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1\": container with ID starting with b6eec2929e508b8a4db2fd746badfee74913c686848f2283ada5d0743e9177c1 not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.298594 4755 scope.go:117] "RemoveContainer" containerID="97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.298856 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4"} err="failed to get container status \"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4\": rpc error: code = NotFound desc = could not find container \"97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4\": container with ID starting with 97e9b567515ac657413363de56d7c0df97f2c186293f38321c5520f474d56db4 not found: ID does not exist" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.370976 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-scripts\") 
pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.371249 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b609c4-d03c-4cdf-941f-99913b969b0f-run-httpd\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.371359 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.371438 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.371520 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b609c4-d03c-4cdf-941f-99913b969b0f-log-httpd\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.371640 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-config-data\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.371764 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-twlms\" (UniqueName: \"kubernetes.io/projected/35b609c4-d03c-4cdf-941f-99913b969b0f-kube-api-access-twlms\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.372407 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b609c4-d03c-4cdf-941f-99913b969b0f-log-httpd\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.373055 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b609c4-d03c-4cdf-941f-99913b969b0f-run-httpd\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.375621 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-scripts\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.375637 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.375750 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-config-data\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 
08:40:03.376698 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.393354 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twlms\" (UniqueName: \"kubernetes.io/projected/35b609c4-d03c-4cdf-941f-99913b969b0f-kube-api-access-twlms\") pod \"ceilometer-0\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.579978 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:03 crc kubenswrapper[4755]: I1006 08:40:03.888790 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="559fe2d2-cbf9-43b5-924e-ba7c7d38d03b" path="/var/lib/kubelet/pods/559fe2d2-cbf9-43b5-924e-ba7c7d38d03b/volumes" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.005324 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:04 crc kubenswrapper[4755]: W1006 08:40:04.010885 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35b609c4_d03c_4cdf_941f_99913b969b0f.slice/crio-7ac60a0d7819e95723b719cec51f734e34ea7c48836d7bfbef11600c500854b3 WatchSource:0}: Error finding container 7ac60a0d7819e95723b719cec51f734e34ea7c48836d7bfbef11600c500854b3: Status 404 returned error can't find the container with id 7ac60a0d7819e95723b719cec51f734e34ea7c48836d7bfbef11600c500854b3 Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.165861 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"35b609c4-d03c-4cdf-941f-99913b969b0f","Type":"ContainerStarted","Data":"7ac60a0d7819e95723b719cec51f734e34ea7c48836d7bfbef11600c500854b3"} Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.342441 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1f63-account-create-59mh2"] Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.344725 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1f63-account-create-59mh2" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.346615 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.352716 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1f63-account-create-59mh2"] Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.389223 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwbnc\" (UniqueName: \"kubernetes.io/projected/74cae32a-c0bb-4798-9466-198d0da08a4c-kube-api-access-jwbnc\") pod \"nova-api-1f63-account-create-59mh2\" (UID: \"74cae32a-c0bb-4798-9466-198d0da08a4c\") " pod="openstack/nova-api-1f63-account-create-59mh2" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.490326 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwbnc\" (UniqueName: \"kubernetes.io/projected/74cae32a-c0bb-4798-9466-198d0da08a4c-kube-api-access-jwbnc\") pod \"nova-api-1f63-account-create-59mh2\" (UID: \"74cae32a-c0bb-4798-9466-198d0da08a4c\") " pod="openstack/nova-api-1f63-account-create-59mh2" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.521444 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwbnc\" (UniqueName: \"kubernetes.io/projected/74cae32a-c0bb-4798-9466-198d0da08a4c-kube-api-access-jwbnc\") pod \"nova-api-1f63-account-create-59mh2\" 
(UID: \"74cae32a-c0bb-4798-9466-198d0da08a4c\") " pod="openstack/nova-api-1f63-account-create-59mh2" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.546220 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-a0db-account-create-z75g9"] Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.547234 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a0db-account-create-z75g9" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.550406 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.554768 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a0db-account-create-z75g9"] Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.592200 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj7ng\" (UniqueName: \"kubernetes.io/projected/d20dd398-8259-4013-b75a-ef645050819e-kube-api-access-pj7ng\") pod \"nova-cell0-a0db-account-create-z75g9\" (UID: \"d20dd398-8259-4013-b75a-ef645050819e\") " pod="openstack/nova-cell0-a0db-account-create-z75g9" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.660910 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1f63-account-create-59mh2" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.671306 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6854f5796f-f7f5s" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.697294 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj7ng\" (UniqueName: \"kubernetes.io/projected/d20dd398-8259-4013-b75a-ef645050819e-kube-api-access-pj7ng\") pod \"nova-cell0-a0db-account-create-z75g9\" (UID: \"d20dd398-8259-4013-b75a-ef645050819e\") " pod="openstack/nova-cell0-a0db-account-create-z75g9" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.718318 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj7ng\" (UniqueName: \"kubernetes.io/projected/d20dd398-8259-4013-b75a-ef645050819e-kube-api-access-pj7ng\") pod \"nova-cell0-a0db-account-create-z75g9\" (UID: \"d20dd398-8259-4013-b75a-ef645050819e\") " pod="openstack/nova-cell0-a0db-account-create-z75g9" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.764385 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-74bd7fb97b-tzfvn"] Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.764689 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-74bd7fb97b-tzfvn" podUID="794ddd23-a887-4125-a7de-c9281188c8ea" containerName="neutron-api" containerID="cri-o://4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149" gracePeriod=30 Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.765234 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-74bd7fb97b-tzfvn" podUID="794ddd23-a887-4125-a7de-c9281188c8ea" containerName="neutron-httpd" containerID="cri-o://f533d98a7717478ebf20dff8ae1eac673afbd01ca2011a47d26517f30346fb3b" gracePeriod=30 Oct 06 08:40:04 crc kubenswrapper[4755]: 
I1006 08:40:04.819801 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-d6ff-account-create-zx2lm"] Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.825352 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d6ff-account-create-zx2lm" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.846651 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.868019 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d6ff-account-create-zx2lm"] Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.915071 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a0db-account-create-z75g9" Oct 06 08:40:04 crc kubenswrapper[4755]: I1006 08:40:04.931491 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dknr7\" (UniqueName: \"kubernetes.io/projected/ab35ed23-84a2-4096-abf1-43a71d39e29b-kube-api-access-dknr7\") pod \"nova-cell1-d6ff-account-create-zx2lm\" (UID: \"ab35ed23-84a2-4096-abf1-43a71d39e29b\") " pod="openstack/nova-cell1-d6ff-account-create-zx2lm" Oct 06 08:40:05 crc kubenswrapper[4755]: I1006 08:40:05.047291 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dknr7\" (UniqueName: \"kubernetes.io/projected/ab35ed23-84a2-4096-abf1-43a71d39e29b-kube-api-access-dknr7\") pod \"nova-cell1-d6ff-account-create-zx2lm\" (UID: \"ab35ed23-84a2-4096-abf1-43a71d39e29b\") " pod="openstack/nova-cell1-d6ff-account-create-zx2lm" Oct 06 08:40:05 crc kubenswrapper[4755]: I1006 08:40:05.080033 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dknr7\" (UniqueName: \"kubernetes.io/projected/ab35ed23-84a2-4096-abf1-43a71d39e29b-kube-api-access-dknr7\") pod 
\"nova-cell1-d6ff-account-create-zx2lm\" (UID: \"ab35ed23-84a2-4096-abf1-43a71d39e29b\") " pod="openstack/nova-cell1-d6ff-account-create-zx2lm" Oct 06 08:40:05 crc kubenswrapper[4755]: I1006 08:40:05.174833 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d6ff-account-create-zx2lm" Oct 06 08:40:05 crc kubenswrapper[4755]: I1006 08:40:05.187003 4755 generic.go:334] "Generic (PLEG): container finished" podID="794ddd23-a887-4125-a7de-c9281188c8ea" containerID="f533d98a7717478ebf20dff8ae1eac673afbd01ca2011a47d26517f30346fb3b" exitCode=0 Oct 06 08:40:05 crc kubenswrapper[4755]: I1006 08:40:05.187086 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74bd7fb97b-tzfvn" event={"ID":"794ddd23-a887-4125-a7de-c9281188c8ea","Type":"ContainerDied","Data":"f533d98a7717478ebf20dff8ae1eac673afbd01ca2011a47d26517f30346fb3b"} Oct 06 08:40:05 crc kubenswrapper[4755]: I1006 08:40:05.188581 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b609c4-d03c-4cdf-941f-99913b969b0f","Type":"ContainerStarted","Data":"f1997516c63c17a3e38f46cefb38aca043014748748967f4d29de2a1582e54c5"} Oct 06 08:40:05 crc kubenswrapper[4755]: W1006 08:40:05.324784 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74cae32a_c0bb_4798_9466_198d0da08a4c.slice/crio-ea337401a87e7a6888072c53797637b1e44eaf3ddef63c8c497813f7e310a372 WatchSource:0}: Error finding container ea337401a87e7a6888072c53797637b1e44eaf3ddef63c8c497813f7e310a372: Status 404 returned error can't find the container with id ea337401a87e7a6888072c53797637b1e44eaf3ddef63c8c497813f7e310a372 Oct 06 08:40:05 crc kubenswrapper[4755]: I1006 08:40:05.327878 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1f63-account-create-59mh2"] Oct 06 08:40:05 crc kubenswrapper[4755]: I1006 08:40:05.523011 4755 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a0db-account-create-z75g9"] Oct 06 08:40:05 crc kubenswrapper[4755]: I1006 08:40:05.701361 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d6ff-account-create-zx2lm"] Oct 06 08:40:06 crc kubenswrapper[4755]: I1006 08:40:06.198986 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b609c4-d03c-4cdf-941f-99913b969b0f","Type":"ContainerStarted","Data":"077ca37ba58734160a5affda7ccce5c912f96172bc9ce8230e120ab7c7e5cc9e"} Oct 06 08:40:06 crc kubenswrapper[4755]: I1006 08:40:06.201520 4755 generic.go:334] "Generic (PLEG): container finished" podID="d20dd398-8259-4013-b75a-ef645050819e" containerID="ce014ca69cc1ed77df7a3dfb5687b4719459a042d347dd31a02436c239aa6d4d" exitCode=0 Oct 06 08:40:06 crc kubenswrapper[4755]: I1006 08:40:06.201618 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a0db-account-create-z75g9" event={"ID":"d20dd398-8259-4013-b75a-ef645050819e","Type":"ContainerDied","Data":"ce014ca69cc1ed77df7a3dfb5687b4719459a042d347dd31a02436c239aa6d4d"} Oct 06 08:40:06 crc kubenswrapper[4755]: I1006 08:40:06.201658 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a0db-account-create-z75g9" event={"ID":"d20dd398-8259-4013-b75a-ef645050819e","Type":"ContainerStarted","Data":"89fa2347321c85643836ba44a0c953205f09a2054dc48f0ba0a822c59fb80ab5"} Oct 06 08:40:06 crc kubenswrapper[4755]: I1006 08:40:06.203334 4755 generic.go:334] "Generic (PLEG): container finished" podID="74cae32a-c0bb-4798-9466-198d0da08a4c" containerID="a6a9cb6921364042b43b9ea86bf6a31a53c6e7596235570602403e72e7522ba7" exitCode=0 Oct 06 08:40:06 crc kubenswrapper[4755]: I1006 08:40:06.203385 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1f63-account-create-59mh2" 
event={"ID":"74cae32a-c0bb-4798-9466-198d0da08a4c","Type":"ContainerDied","Data":"a6a9cb6921364042b43b9ea86bf6a31a53c6e7596235570602403e72e7522ba7"} Oct 06 08:40:06 crc kubenswrapper[4755]: I1006 08:40:06.203407 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1f63-account-create-59mh2" event={"ID":"74cae32a-c0bb-4798-9466-198d0da08a4c","Type":"ContainerStarted","Data":"ea337401a87e7a6888072c53797637b1e44eaf3ddef63c8c497813f7e310a372"} Oct 06 08:40:06 crc kubenswrapper[4755]: I1006 08:40:06.204679 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d6ff-account-create-zx2lm" event={"ID":"ab35ed23-84a2-4096-abf1-43a71d39e29b","Type":"ContainerStarted","Data":"718bf2facd3e8128dd53b5f599f0d82a100fc801e190e6c3bbf69a779e6bc625"} Oct 06 08:40:06 crc kubenswrapper[4755]: I1006 08:40:06.204706 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d6ff-account-create-zx2lm" event={"ID":"ab35ed23-84a2-4096-abf1-43a71d39e29b","Type":"ContainerStarted","Data":"046c21f31d4d58f0b3f4390d6b77352e13d2b0b6dce896620c689d4172f4cbb4"} Oct 06 08:40:06 crc kubenswrapper[4755]: I1006 08:40:06.245869 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-d6ff-account-create-zx2lm" podStartSLOduration=2.245847683 podStartE2EDuration="2.245847683s" podCreationTimestamp="2025-10-06 08:40:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:06.242639491 +0000 UTC m=+1063.071954705" watchObservedRunningTime="2025-10-06 08:40:06.245847683 +0000 UTC m=+1063.075162887" Oct 06 08:40:07 crc kubenswrapper[4755]: I1006 08:40:07.214722 4755 generic.go:334] "Generic (PLEG): container finished" podID="ab35ed23-84a2-4096-abf1-43a71d39e29b" containerID="718bf2facd3e8128dd53b5f599f0d82a100fc801e190e6c3bbf69a779e6bc625" exitCode=0 Oct 06 08:40:07 crc kubenswrapper[4755]: I1006 
08:40:07.214764 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d6ff-account-create-zx2lm" event={"ID":"ab35ed23-84a2-4096-abf1-43a71d39e29b","Type":"ContainerDied","Data":"718bf2facd3e8128dd53b5f599f0d82a100fc801e190e6c3bbf69a779e6bc625"} Oct 06 08:40:07 crc kubenswrapper[4755]: I1006 08:40:07.218414 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b609c4-d03c-4cdf-941f-99913b969b0f","Type":"ContainerStarted","Data":"e02dda8ba0b178c2ff5c788b3f73f2ee3fcda64678b3bdfdcfdbed36f711e3b7"} Oct 06 08:40:07 crc kubenswrapper[4755]: I1006 08:40:07.618060 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1f63-account-create-59mh2" Oct 06 08:40:07 crc kubenswrapper[4755]: I1006 08:40:07.698193 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwbnc\" (UniqueName: \"kubernetes.io/projected/74cae32a-c0bb-4798-9466-198d0da08a4c-kube-api-access-jwbnc\") pod \"74cae32a-c0bb-4798-9466-198d0da08a4c\" (UID: \"74cae32a-c0bb-4798-9466-198d0da08a4c\") " Oct 06 08:40:07 crc kubenswrapper[4755]: I1006 08:40:07.706771 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74cae32a-c0bb-4798-9466-198d0da08a4c-kube-api-access-jwbnc" (OuterVolumeSpecName: "kube-api-access-jwbnc") pod "74cae32a-c0bb-4798-9466-198d0da08a4c" (UID: "74cae32a-c0bb-4798-9466-198d0da08a4c"). InnerVolumeSpecName "kube-api-access-jwbnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:07 crc kubenswrapper[4755]: I1006 08:40:07.800538 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwbnc\" (UniqueName: \"kubernetes.io/projected/74cae32a-c0bb-4798-9466-198d0da08a4c-kube-api-access-jwbnc\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:07 crc kubenswrapper[4755]: I1006 08:40:07.886075 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a0db-account-create-z75g9" Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.003424 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj7ng\" (UniqueName: \"kubernetes.io/projected/d20dd398-8259-4013-b75a-ef645050819e-kube-api-access-pj7ng\") pod \"d20dd398-8259-4013-b75a-ef645050819e\" (UID: \"d20dd398-8259-4013-b75a-ef645050819e\") " Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.008292 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d20dd398-8259-4013-b75a-ef645050819e-kube-api-access-pj7ng" (OuterVolumeSpecName: "kube-api-access-pj7ng") pod "d20dd398-8259-4013-b75a-ef645050819e" (UID: "d20dd398-8259-4013-b75a-ef645050819e"). InnerVolumeSpecName "kube-api-access-pj7ng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.105834 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj7ng\" (UniqueName: \"kubernetes.io/projected/d20dd398-8259-4013-b75a-ef645050819e-kube-api-access-pj7ng\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.234057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b609c4-d03c-4cdf-941f-99913b969b0f","Type":"ContainerStarted","Data":"97f32715f7e9c4e7a76b749adc785d2583bf623fbadd1d6d4535b99983f709df"} Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.234462 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.239159 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a0db-account-create-z75g9" event={"ID":"d20dd398-8259-4013-b75a-ef645050819e","Type":"ContainerDied","Data":"89fa2347321c85643836ba44a0c953205f09a2054dc48f0ba0a822c59fb80ab5"} Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.239482 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89fa2347321c85643836ba44a0c953205f09a2054dc48f0ba0a822c59fb80ab5" Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.239757 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a0db-account-create-z75g9" Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.242908 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1f63-account-create-59mh2" Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.244213 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1f63-account-create-59mh2" event={"ID":"74cae32a-c0bb-4798-9466-198d0da08a4c","Type":"ContainerDied","Data":"ea337401a87e7a6888072c53797637b1e44eaf3ddef63c8c497813f7e310a372"} Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.244251 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea337401a87e7a6888072c53797637b1e44eaf3ddef63c8c497813f7e310a372" Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.278261 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.561118074 podStartE2EDuration="5.278237131s" podCreationTimestamp="2025-10-06 08:40:03 +0000 UTC" firstStartedPulling="2025-10-06 08:40:04.013130655 +0000 UTC m=+1060.842445869" lastFinishedPulling="2025-10-06 08:40:07.730249712 +0000 UTC m=+1064.559564926" observedRunningTime="2025-10-06 08:40:08.256884678 +0000 UTC m=+1065.086199912" watchObservedRunningTime="2025-10-06 08:40:08.278237131 +0000 UTC m=+1065.107552345" Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.493101 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d6ff-account-create-zx2lm" Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.616477 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dknr7\" (UniqueName: \"kubernetes.io/projected/ab35ed23-84a2-4096-abf1-43a71d39e29b-kube-api-access-dknr7\") pod \"ab35ed23-84a2-4096-abf1-43a71d39e29b\" (UID: \"ab35ed23-84a2-4096-abf1-43a71d39e29b\") " Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.620510 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab35ed23-84a2-4096-abf1-43a71d39e29b-kube-api-access-dknr7" (OuterVolumeSpecName: "kube-api-access-dknr7") pod "ab35ed23-84a2-4096-abf1-43a71d39e29b" (UID: "ab35ed23-84a2-4096-abf1-43a71d39e29b"). InnerVolumeSpecName "kube-api-access-dknr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:08 crc kubenswrapper[4755]: I1006 08:40:08.719064 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dknr7\" (UniqueName: \"kubernetes.io/projected/ab35ed23-84a2-4096-abf1-43a71d39e29b-kube-api-access-dknr7\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.254923 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d6ff-account-create-zx2lm" event={"ID":"ab35ed23-84a2-4096-abf1-43a71d39e29b","Type":"ContainerDied","Data":"046c21f31d4d58f0b3f4390d6b77352e13d2b0b6dce896620c689d4172f4cbb4"} Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.255306 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="046c21f31d4d58f0b3f4390d6b77352e13d2b0b6dce896620c689d4172f4cbb4" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.254983 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d6ff-account-create-zx2lm" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.774420 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lq2rs"] Oct 06 08:40:09 crc kubenswrapper[4755]: E1006 08:40:09.774851 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74cae32a-c0bb-4798-9466-198d0da08a4c" containerName="mariadb-account-create" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.774874 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="74cae32a-c0bb-4798-9466-198d0da08a4c" containerName="mariadb-account-create" Oct 06 08:40:09 crc kubenswrapper[4755]: E1006 08:40:09.774897 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20dd398-8259-4013-b75a-ef645050819e" containerName="mariadb-account-create" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.774904 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20dd398-8259-4013-b75a-ef645050819e" containerName="mariadb-account-create" Oct 06 08:40:09 crc kubenswrapper[4755]: E1006 08:40:09.774918 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab35ed23-84a2-4096-abf1-43a71d39e29b" containerName="mariadb-account-create" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.774926 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab35ed23-84a2-4096-abf1-43a71d39e29b" containerName="mariadb-account-create" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.775122 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="74cae32a-c0bb-4798-9466-198d0da08a4c" containerName="mariadb-account-create" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.775148 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d20dd398-8259-4013-b75a-ef645050819e" containerName="mariadb-account-create" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.775180 4755 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ab35ed23-84a2-4096-abf1-43a71d39e29b" containerName="mariadb-account-create" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.775931 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.779683 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.779924 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9js6v" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.780184 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.797722 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lq2rs"] Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.837588 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-scripts\") pod \"nova-cell0-conductor-db-sync-lq2rs\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.837645 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-config-data\") pod \"nova-cell0-conductor-db-sync-lq2rs\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.837807 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-845bk\" (UniqueName: \"kubernetes.io/projected/1730006e-7d6b-47eb-a254-e04ba1f1a44e-kube-api-access-845bk\") pod \"nova-cell0-conductor-db-sync-lq2rs\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.837850 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lq2rs\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.939487 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-845bk\" (UniqueName: \"kubernetes.io/projected/1730006e-7d6b-47eb-a254-e04ba1f1a44e-kube-api-access-845bk\") pod \"nova-cell0-conductor-db-sync-lq2rs\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.939547 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lq2rs\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.939636 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-scripts\") pod \"nova-cell0-conductor-db-sync-lq2rs\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.939662 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-config-data\") pod \"nova-cell0-conductor-db-sync-lq2rs\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.945203 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lq2rs\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.945223 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-config-data\") pod \"nova-cell0-conductor-db-sync-lq2rs\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.949479 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-scripts\") pod \"nova-cell0-conductor-db-sync-lq2rs\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:09 crc kubenswrapper[4755]: I1006 08:40:09.968232 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-845bk\" (UniqueName: \"kubernetes.io/projected/1730006e-7d6b-47eb-a254-e04ba1f1a44e-kube-api-access-845bk\") pod \"nova-cell0-conductor-db-sync-lq2rs\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:10 crc kubenswrapper[4755]: I1006 08:40:10.093096 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:10 crc kubenswrapper[4755]: E1006 08:40:10.619879 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod794ddd23_a887_4125_a7de_c9281188c8ea.slice/crio-4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149.scope\": RecentStats: unable to find data in memory cache]" Oct 06 08:40:10 crc kubenswrapper[4755]: I1006 08:40:10.659181 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lq2rs"] Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.137953 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.270652 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-ovndb-tls-certs\") pod \"794ddd23-a887-4125-a7de-c9281188c8ea\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.271852 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-httpd-config\") pod \"794ddd23-a887-4125-a7de-c9281188c8ea\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.272021 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrfhk\" (UniqueName: \"kubernetes.io/projected/794ddd23-a887-4125-a7de-c9281188c8ea-kube-api-access-hrfhk\") pod \"794ddd23-a887-4125-a7de-c9281188c8ea\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.272248 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-config\") pod \"794ddd23-a887-4125-a7de-c9281188c8ea\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.272360 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-combined-ca-bundle\") pod \"794ddd23-a887-4125-a7de-c9281188c8ea\" (UID: \"794ddd23-a887-4125-a7de-c9281188c8ea\") " Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.276764 4755 generic.go:334] "Generic (PLEG): container finished" podID="794ddd23-a887-4125-a7de-c9281188c8ea" containerID="4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149" exitCode=0 Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.276810 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-74bd7fb97b-tzfvn" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.277002 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74bd7fb97b-tzfvn" event={"ID":"794ddd23-a887-4125-a7de-c9281188c8ea","Type":"ContainerDied","Data":"4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149"} Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.277522 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74bd7fb97b-tzfvn" event={"ID":"794ddd23-a887-4125-a7de-c9281188c8ea","Type":"ContainerDied","Data":"505cfc49ed0cb84bd48121fd4cb9875b38260b745c062f47afbdd9eb9d53703d"} Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.277613 4755 scope.go:117] "RemoveContainer" containerID="f533d98a7717478ebf20dff8ae1eac673afbd01ca2011a47d26517f30346fb3b" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.280511 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lq2rs" event={"ID":"1730006e-7d6b-47eb-a254-e04ba1f1a44e","Type":"ContainerStarted","Data":"7c8867cd7a908239d1a06ac337dfad7c57e4632bca89196851fa2e08a9b96bf5"} Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.290378 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "794ddd23-a887-4125-a7de-c9281188c8ea" (UID: "794ddd23-a887-4125-a7de-c9281188c8ea"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.290468 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794ddd23-a887-4125-a7de-c9281188c8ea-kube-api-access-hrfhk" (OuterVolumeSpecName: "kube-api-access-hrfhk") pod "794ddd23-a887-4125-a7de-c9281188c8ea" (UID: "794ddd23-a887-4125-a7de-c9281188c8ea"). 
InnerVolumeSpecName "kube-api-access-hrfhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.317349 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-config" (OuterVolumeSpecName: "config") pod "794ddd23-a887-4125-a7de-c9281188c8ea" (UID: "794ddd23-a887-4125-a7de-c9281188c8ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.338031 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "794ddd23-a887-4125-a7de-c9281188c8ea" (UID: "794ddd23-a887-4125-a7de-c9281188c8ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.349201 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "794ddd23-a887-4125-a7de-c9281188c8ea" (UID: "794ddd23-a887-4125-a7de-c9281188c8ea"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.374803 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrfhk\" (UniqueName: \"kubernetes.io/projected/794ddd23-a887-4125-a7de-c9281188c8ea-kube-api-access-hrfhk\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.374976 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.375044 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.375125 4755 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.375195 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/794ddd23-a887-4125-a7de-c9281188c8ea-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.389476 4755 scope.go:117] "RemoveContainer" containerID="4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.407335 4755 scope.go:117] "RemoveContainer" containerID="f533d98a7717478ebf20dff8ae1eac673afbd01ca2011a47d26517f30346fb3b" Oct 06 08:40:11 crc kubenswrapper[4755]: E1006 08:40:11.407693 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f533d98a7717478ebf20dff8ae1eac673afbd01ca2011a47d26517f30346fb3b\": container with ID starting with f533d98a7717478ebf20dff8ae1eac673afbd01ca2011a47d26517f30346fb3b not found: ID does not exist" containerID="f533d98a7717478ebf20dff8ae1eac673afbd01ca2011a47d26517f30346fb3b" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.407721 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f533d98a7717478ebf20dff8ae1eac673afbd01ca2011a47d26517f30346fb3b"} err="failed to get container status \"f533d98a7717478ebf20dff8ae1eac673afbd01ca2011a47d26517f30346fb3b\": rpc error: code = NotFound desc = could not find container \"f533d98a7717478ebf20dff8ae1eac673afbd01ca2011a47d26517f30346fb3b\": container with ID starting with f533d98a7717478ebf20dff8ae1eac673afbd01ca2011a47d26517f30346fb3b not found: ID does not exist" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.407742 4755 scope.go:117] "RemoveContainer" containerID="4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149" Oct 06 08:40:11 crc kubenswrapper[4755]: E1006 08:40:11.408187 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149\": container with ID starting with 4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149 not found: ID does not exist" containerID="4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.408213 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149"} err="failed to get container status \"4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149\": rpc error: code = NotFound desc = could not find container \"4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149\": container with ID 
starting with 4a119ea2595681e111bb993455186bebb96eb08164f75a9b18f355c61e573149 not found: ID does not exist" Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.608456 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-74bd7fb97b-tzfvn"] Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.615178 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-74bd7fb97b-tzfvn"] Oct 06 08:40:11 crc kubenswrapper[4755]: I1006 08:40:11.892380 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794ddd23-a887-4125-a7de-c9281188c8ea" path="/var/lib/kubelet/pods/794ddd23-a887-4125-a7de-c9281188c8ea/volumes" Oct 06 08:40:17 crc kubenswrapper[4755]: I1006 08:40:17.336693 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lq2rs" event={"ID":"1730006e-7d6b-47eb-a254-e04ba1f1a44e","Type":"ContainerStarted","Data":"8b136dda21744b14dcde7049379ef128262393a590b1c50a30730d9117bf1c0d"} Oct 06 08:40:17 crc kubenswrapper[4755]: I1006 08:40:17.352982 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-lq2rs" podStartSLOduration=1.934990904 podStartE2EDuration="8.352963746s" podCreationTimestamp="2025-10-06 08:40:09 +0000 UTC" firstStartedPulling="2025-10-06 08:40:10.698152555 +0000 UTC m=+1067.527467769" lastFinishedPulling="2025-10-06 08:40:17.116125397 +0000 UTC m=+1073.945440611" observedRunningTime="2025-10-06 08:40:17.348683447 +0000 UTC m=+1074.177998661" watchObservedRunningTime="2025-10-06 08:40:17.352963746 +0000 UTC m=+1074.182278960" Oct 06 08:40:28 crc kubenswrapper[4755]: I1006 08:40:28.439604 4755 generic.go:334] "Generic (PLEG): container finished" podID="1730006e-7d6b-47eb-a254-e04ba1f1a44e" containerID="8b136dda21744b14dcde7049379ef128262393a590b1c50a30730d9117bf1c0d" exitCode=0 Oct 06 08:40:28 crc kubenswrapper[4755]: I1006 08:40:28.439683 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell0-conductor-db-sync-lq2rs" event={"ID":"1730006e-7d6b-47eb-a254-e04ba1f1a44e","Type":"ContainerDied","Data":"8b136dda21744b14dcde7049379ef128262393a590b1c50a30730d9117bf1c0d"} Oct 06 08:40:29 crc kubenswrapper[4755]: I1006 08:40:29.846629 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:29 crc kubenswrapper[4755]: I1006 08:40:29.891690 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-combined-ca-bundle\") pod \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " Oct 06 08:40:29 crc kubenswrapper[4755]: I1006 08:40:29.891813 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-config-data\") pod \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " Oct 06 08:40:29 crc kubenswrapper[4755]: I1006 08:40:29.891888 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-scripts\") pod \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " Oct 06 08:40:29 crc kubenswrapper[4755]: I1006 08:40:29.891927 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-845bk\" (UniqueName: \"kubernetes.io/projected/1730006e-7d6b-47eb-a254-e04ba1f1a44e-kube-api-access-845bk\") pod \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\" (UID: \"1730006e-7d6b-47eb-a254-e04ba1f1a44e\") " Oct 06 08:40:29 crc kubenswrapper[4755]: I1006 08:40:29.897709 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1730006e-7d6b-47eb-a254-e04ba1f1a44e-kube-api-access-845bk" (OuterVolumeSpecName: "kube-api-access-845bk") pod "1730006e-7d6b-47eb-a254-e04ba1f1a44e" (UID: "1730006e-7d6b-47eb-a254-e04ba1f1a44e"). InnerVolumeSpecName "kube-api-access-845bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:29 crc kubenswrapper[4755]: I1006 08:40:29.898551 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-scripts" (OuterVolumeSpecName: "scripts") pod "1730006e-7d6b-47eb-a254-e04ba1f1a44e" (UID: "1730006e-7d6b-47eb-a254-e04ba1f1a44e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:29 crc kubenswrapper[4755]: I1006 08:40:29.914873 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1730006e-7d6b-47eb-a254-e04ba1f1a44e" (UID: "1730006e-7d6b-47eb-a254-e04ba1f1a44e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:29 crc kubenswrapper[4755]: I1006 08:40:29.928970 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-config-data" (OuterVolumeSpecName: "config-data") pod "1730006e-7d6b-47eb-a254-e04ba1f1a44e" (UID: "1730006e-7d6b-47eb-a254-e04ba1f1a44e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:29 crc kubenswrapper[4755]: I1006 08:40:29.993241 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:29 crc kubenswrapper[4755]: I1006 08:40:29.993275 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:29 crc kubenswrapper[4755]: I1006 08:40:29.993287 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-845bk\" (UniqueName: \"kubernetes.io/projected/1730006e-7d6b-47eb-a254-e04ba1f1a44e-kube-api-access-845bk\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:29 crc kubenswrapper[4755]: I1006 08:40:29.993299 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1730006e-7d6b-47eb-a254-e04ba1f1a44e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.459581 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lq2rs" event={"ID":"1730006e-7d6b-47eb-a254-e04ba1f1a44e","Type":"ContainerDied","Data":"7c8867cd7a908239d1a06ac337dfad7c57e4632bca89196851fa2e08a9b96bf5"} Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.459843 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c8867cd7a908239d1a06ac337dfad7c57e4632bca89196851fa2e08a9b96bf5" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.459645 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lq2rs" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.615998 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 08:40:30 crc kubenswrapper[4755]: E1006 08:40:30.616372 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794ddd23-a887-4125-a7de-c9281188c8ea" containerName="neutron-httpd" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.616388 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="794ddd23-a887-4125-a7de-c9281188c8ea" containerName="neutron-httpd" Oct 06 08:40:30 crc kubenswrapper[4755]: E1006 08:40:30.616405 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1730006e-7d6b-47eb-a254-e04ba1f1a44e" containerName="nova-cell0-conductor-db-sync" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.616411 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1730006e-7d6b-47eb-a254-e04ba1f1a44e" containerName="nova-cell0-conductor-db-sync" Oct 06 08:40:30 crc kubenswrapper[4755]: E1006 08:40:30.616440 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794ddd23-a887-4125-a7de-c9281188c8ea" containerName="neutron-api" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.616447 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="794ddd23-a887-4125-a7de-c9281188c8ea" containerName="neutron-api" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.616602 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="794ddd23-a887-4125-a7de-c9281188c8ea" containerName="neutron-httpd" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.616615 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="794ddd23-a887-4125-a7de-c9281188c8ea" containerName="neutron-api" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.616655 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1730006e-7d6b-47eb-a254-e04ba1f1a44e" containerName="nova-cell0-conductor-db-sync" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.617223 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.619422 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.619722 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9js6v" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.633978 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.704266 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.704394 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp826\" (UniqueName: \"kubernetes.io/projected/68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c-kube-api-access-dp826\") pod \"nova-cell0-conductor-0\" (UID: \"68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.704461 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:30 crc 
kubenswrapper[4755]: I1006 08:40:30.805844 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.805939 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.806015 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp826\" (UniqueName: \"kubernetes.io/projected/68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c-kube-api-access-dp826\") pod \"nova-cell0-conductor-0\" (UID: \"68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.810435 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.819069 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.822049 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dp826\" (UniqueName: \"kubernetes.io/projected/68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c-kube-api-access-dp826\") pod \"nova-cell0-conductor-0\" (UID: \"68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:30 crc kubenswrapper[4755]: I1006 08:40:30.943163 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:31 crc kubenswrapper[4755]: I1006 08:40:31.425787 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 08:40:31 crc kubenswrapper[4755]: W1006 08:40:31.435932 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68ef8a3b_ac54_4a60_bdf6_fc5d5d3fb66c.slice/crio-7684b29a1c4404c201afc033e65abe33ef9ce8ab8a535380f1ed70052bb8c3b0 WatchSource:0}: Error finding container 7684b29a1c4404c201afc033e65abe33ef9ce8ab8a535380f1ed70052bb8c3b0: Status 404 returned error can't find the container with id 7684b29a1c4404c201afc033e65abe33ef9ce8ab8a535380f1ed70052bb8c3b0 Oct 06 08:40:31 crc kubenswrapper[4755]: I1006 08:40:31.470001 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c","Type":"ContainerStarted","Data":"7684b29a1c4404c201afc033e65abe33ef9ce8ab8a535380f1ed70052bb8c3b0"} Oct 06 08:40:32 crc kubenswrapper[4755]: I1006 08:40:32.480801 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c","Type":"ContainerStarted","Data":"a90b5eb143da4a95705b06521d8dd3e5622880877409d09711c311af4584e639"} Oct 06 08:40:32 crc kubenswrapper[4755]: I1006 08:40:32.481124 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:32 crc kubenswrapper[4755]: I1006 08:40:32.496227 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.496206568 podStartE2EDuration="2.496206568s" podCreationTimestamp="2025-10-06 08:40:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:32.494138325 +0000 UTC m=+1089.323453559" watchObservedRunningTime="2025-10-06 08:40:32.496206568 +0000 UTC m=+1089.325521782" Oct 06 08:40:33 crc kubenswrapper[4755]: I1006 08:40:33.586620 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4755]: I1006 08:40:35.877391 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:40:35 crc kubenswrapper[4755]: I1006 08:40:35.877936 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1fd020a3-7f41-424d-acd4-0e06764fafb3" containerName="kube-state-metrics" containerID="cri-o://e6fc338c96cb9fe6db14d7ba12328ddd08f9a86bba6f238181c42e11b173bec5" gracePeriod=30 Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.325513 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.426071 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v8vb\" (UniqueName: \"kubernetes.io/projected/1fd020a3-7f41-424d-acd4-0e06764fafb3-kube-api-access-7v8vb\") pod \"1fd020a3-7f41-424d-acd4-0e06764fafb3\" (UID: \"1fd020a3-7f41-424d-acd4-0e06764fafb3\") " Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.431997 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fd020a3-7f41-424d-acd4-0e06764fafb3-kube-api-access-7v8vb" (OuterVolumeSpecName: "kube-api-access-7v8vb") pod "1fd020a3-7f41-424d-acd4-0e06764fafb3" (UID: "1fd020a3-7f41-424d-acd4-0e06764fafb3"). InnerVolumeSpecName "kube-api-access-7v8vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.511741 4755 generic.go:334] "Generic (PLEG): container finished" podID="1fd020a3-7f41-424d-acd4-0e06764fafb3" containerID="e6fc338c96cb9fe6db14d7ba12328ddd08f9a86bba6f238181c42e11b173bec5" exitCode=2 Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.511793 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1fd020a3-7f41-424d-acd4-0e06764fafb3","Type":"ContainerDied","Data":"e6fc338c96cb9fe6db14d7ba12328ddd08f9a86bba6f238181c42e11b173bec5"} Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.511800 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.511824 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1fd020a3-7f41-424d-acd4-0e06764fafb3","Type":"ContainerDied","Data":"0225053ecdac0fdae29f01f3f868a5de82154857aac4be18069e0eac94d04d22"} Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.511849 4755 scope.go:117] "RemoveContainer" containerID="e6fc338c96cb9fe6db14d7ba12328ddd08f9a86bba6f238181c42e11b173bec5" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.527778 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v8vb\" (UniqueName: \"kubernetes.io/projected/1fd020a3-7f41-424d-acd4-0e06764fafb3-kube-api-access-7v8vb\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.544189 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.545656 4755 scope.go:117] "RemoveContainer" containerID="e6fc338c96cb9fe6db14d7ba12328ddd08f9a86bba6f238181c42e11b173bec5" Oct 06 08:40:36 crc kubenswrapper[4755]: E1006 08:40:36.546187 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6fc338c96cb9fe6db14d7ba12328ddd08f9a86bba6f238181c42e11b173bec5\": container with ID starting with e6fc338c96cb9fe6db14d7ba12328ddd08f9a86bba6f238181c42e11b173bec5 not found: ID does not exist" containerID="e6fc338c96cb9fe6db14d7ba12328ddd08f9a86bba6f238181c42e11b173bec5" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.546224 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6fc338c96cb9fe6db14d7ba12328ddd08f9a86bba6f238181c42e11b173bec5"} err="failed to get container status \"e6fc338c96cb9fe6db14d7ba12328ddd08f9a86bba6f238181c42e11b173bec5\": rpc error: code = NotFound desc = 
could not find container \"e6fc338c96cb9fe6db14d7ba12328ddd08f9a86bba6f238181c42e11b173bec5\": container with ID starting with e6fc338c96cb9fe6db14d7ba12328ddd08f9a86bba6f238181c42e11b173bec5 not found: ID does not exist" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.557150 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.568316 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:40:36 crc kubenswrapper[4755]: E1006 08:40:36.568746 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd020a3-7f41-424d-acd4-0e06764fafb3" containerName="kube-state-metrics" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.568763 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd020a3-7f41-424d-acd4-0e06764fafb3" containerName="kube-state-metrics" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.568947 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd020a3-7f41-424d-acd4-0e06764fafb3" containerName="kube-state-metrics" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.569538 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.578864 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.579033 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.583467 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.628924 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wr7f\" (UniqueName: \"kubernetes.io/projected/ecac61c8-6fb8-4eac-b34b-2589131fbeca-kube-api-access-5wr7f\") pod \"kube-state-metrics-0\" (UID: \"ecac61c8-6fb8-4eac-b34b-2589131fbeca\") " pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.629001 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecac61c8-6fb8-4eac-b34b-2589131fbeca-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ecac61c8-6fb8-4eac-b34b-2589131fbeca\") " pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.629055 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecac61c8-6fb8-4eac-b34b-2589131fbeca-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ecac61c8-6fb8-4eac-b34b-2589131fbeca\") " pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.629167 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/ecac61c8-6fb8-4eac-b34b-2589131fbeca-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ecac61c8-6fb8-4eac-b34b-2589131fbeca\") " pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.731235 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ecac61c8-6fb8-4eac-b34b-2589131fbeca-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ecac61c8-6fb8-4eac-b34b-2589131fbeca\") " pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.731338 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wr7f\" (UniqueName: \"kubernetes.io/projected/ecac61c8-6fb8-4eac-b34b-2589131fbeca-kube-api-access-5wr7f\") pod \"kube-state-metrics-0\" (UID: \"ecac61c8-6fb8-4eac-b34b-2589131fbeca\") " pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.731389 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecac61c8-6fb8-4eac-b34b-2589131fbeca-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ecac61c8-6fb8-4eac-b34b-2589131fbeca\") " pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.731476 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecac61c8-6fb8-4eac-b34b-2589131fbeca-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ecac61c8-6fb8-4eac-b34b-2589131fbeca\") " pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.735625 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ecac61c8-6fb8-4eac-b34b-2589131fbeca-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ecac61c8-6fb8-4eac-b34b-2589131fbeca\") " pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.736106 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ecac61c8-6fb8-4eac-b34b-2589131fbeca-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ecac61c8-6fb8-4eac-b34b-2589131fbeca\") " pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.742778 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecac61c8-6fb8-4eac-b34b-2589131fbeca-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ecac61c8-6fb8-4eac-b34b-2589131fbeca\") " pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.751515 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wr7f\" (UniqueName: \"kubernetes.io/projected/ecac61c8-6fb8-4eac-b34b-2589131fbeca-kube-api-access-5wr7f\") pod \"kube-state-metrics-0\" (UID: \"ecac61c8-6fb8-4eac-b34b-2589131fbeca\") " pod="openstack/kube-state-metrics-0" Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.824744 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.825080 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="proxy-httpd" containerID="cri-o://97f32715f7e9c4e7a76b749adc785d2583bf623fbadd1d6d4535b99983f709df" gracePeriod=30 Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.825112 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="sg-core" containerID="cri-o://e02dda8ba0b178c2ff5c788b3f73f2ee3fcda64678b3bdfdcfdbed36f711e3b7" gracePeriod=30 Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.825180 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="ceilometer-notification-agent" containerID="cri-o://077ca37ba58734160a5affda7ccce5c912f96172bc9ce8230e120ab7c7e5cc9e" gracePeriod=30 Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.825211 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="ceilometer-central-agent" containerID="cri-o://f1997516c63c17a3e38f46cefb38aca043014748748967f4d29de2a1582e54c5" gracePeriod=30 Oct 06 08:40:36 crc kubenswrapper[4755]: I1006 08:40:36.895720 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:40:37 crc kubenswrapper[4755]: I1006 08:40:37.370502 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:40:37 crc kubenswrapper[4755]: W1006 08:40:37.380172 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecac61c8_6fb8_4eac_b34b_2589131fbeca.slice/crio-ae0a69d64b0c77bf4261090c9d0a1489e3e5164e44395f20943a01dcaed5dc99 WatchSource:0}: Error finding container ae0a69d64b0c77bf4261090c9d0a1489e3e5164e44395f20943a01dcaed5dc99: Status 404 returned error can't find the container with id ae0a69d64b0c77bf4261090c9d0a1489e3e5164e44395f20943a01dcaed5dc99 Oct 06 08:40:37 crc kubenswrapper[4755]: I1006 08:40:37.521538 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ecac61c8-6fb8-4eac-b34b-2589131fbeca","Type":"ContainerStarted","Data":"ae0a69d64b0c77bf4261090c9d0a1489e3e5164e44395f20943a01dcaed5dc99"} Oct 06 08:40:37 crc kubenswrapper[4755]: I1006 08:40:37.524318 4755 generic.go:334] "Generic (PLEG): container finished" podID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerID="97f32715f7e9c4e7a76b749adc785d2583bf623fbadd1d6d4535b99983f709df" exitCode=0 Oct 06 08:40:37 crc kubenswrapper[4755]: I1006 08:40:37.524346 4755 generic.go:334] "Generic (PLEG): container finished" podID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerID="e02dda8ba0b178c2ff5c788b3f73f2ee3fcda64678b3bdfdcfdbed36f711e3b7" exitCode=2 Oct 06 08:40:37 crc kubenswrapper[4755]: I1006 08:40:37.524354 4755 generic.go:334] "Generic (PLEG): container finished" podID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerID="f1997516c63c17a3e38f46cefb38aca043014748748967f4d29de2a1582e54c5" exitCode=0 Oct 06 08:40:37 crc kubenswrapper[4755]: I1006 08:40:37.524381 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"35b609c4-d03c-4cdf-941f-99913b969b0f","Type":"ContainerDied","Data":"97f32715f7e9c4e7a76b749adc785d2583bf623fbadd1d6d4535b99983f709df"} Oct 06 08:40:37 crc kubenswrapper[4755]: I1006 08:40:37.524413 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b609c4-d03c-4cdf-941f-99913b969b0f","Type":"ContainerDied","Data":"e02dda8ba0b178c2ff5c788b3f73f2ee3fcda64678b3bdfdcfdbed36f711e3b7"} Oct 06 08:40:37 crc kubenswrapper[4755]: I1006 08:40:37.524424 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b609c4-d03c-4cdf-941f-99913b969b0f","Type":"ContainerDied","Data":"f1997516c63c17a3e38f46cefb38aca043014748748967f4d29de2a1582e54c5"} Oct 06 08:40:37 crc kubenswrapper[4755]: I1006 08:40:37.931983 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fd020a3-7f41-424d-acd4-0e06764fafb3" path="/var/lib/kubelet/pods/1fd020a3-7f41-424d-acd4-0e06764fafb3/volumes" Oct 06 08:40:38 crc kubenswrapper[4755]: I1006 08:40:38.534254 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ecac61c8-6fb8-4eac-b34b-2589131fbeca","Type":"ContainerStarted","Data":"190a8b4b74bb08ba6d6bcd23eae3c739c3acad746787ad88e234110f6fe2089b"} Oct 06 08:40:38 crc kubenswrapper[4755]: I1006 08:40:38.534414 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 06 08:40:38 crc kubenswrapper[4755]: I1006 08:40:38.560084 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.186397923 podStartE2EDuration="2.560061035s" podCreationTimestamp="2025-10-06 08:40:36 +0000 UTC" firstStartedPulling="2025-10-06 08:40:37.382462387 +0000 UTC m=+1094.211777601" lastFinishedPulling="2025-10-06 08:40:37.756125499 +0000 UTC m=+1094.585440713" observedRunningTime="2025-10-06 08:40:38.550171563 +0000 UTC m=+1095.379486767" 
watchObservedRunningTime="2025-10-06 08:40:38.560061035 +0000 UTC m=+1095.389376269" Oct 06 08:40:40 crc kubenswrapper[4755]: I1006 08:40:40.975475 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.422705 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-wfkm6"] Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.425871 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.456939 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wfkm6"] Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.458900 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.458930 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.522165 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-scripts\") pod \"nova-cell0-cell-mapping-wfkm6\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.522682 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5hfw\" (UniqueName: \"kubernetes.io/projected/ce67a97c-6bfd-4684-be25-c82eec5f8237-kube-api-access-r5hfw\") pod \"nova-cell0-cell-mapping-wfkm6\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.523025 
4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-config-data\") pod \"nova-cell0-cell-mapping-wfkm6\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.523187 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wfkm6\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.625442 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-config-data\") pod \"nova-cell0-cell-mapping-wfkm6\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.625489 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wfkm6\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.625554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-scripts\") pod \"nova-cell0-cell-mapping-wfkm6\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.625641 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r5hfw\" (UniqueName: \"kubernetes.io/projected/ce67a97c-6bfd-4684-be25-c82eec5f8237-kube-api-access-r5hfw\") pod \"nova-cell0-cell-mapping-wfkm6\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.628050 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.629554 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.633139 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.636401 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.641211 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.645695 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-config-data\") pod \"nova-cell0-cell-mapping-wfkm6\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.646072 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.646813 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5hfw\" (UniqueName: \"kubernetes.io/projected/ce67a97c-6bfd-4684-be25-c82eec5f8237-kube-api-access-r5hfw\") pod \"nova-cell0-cell-mapping-wfkm6\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.658933 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wfkm6\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.661159 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-scripts\") pod \"nova-cell0-cell-mapping-wfkm6\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.668149 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.699499 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.727767 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44jrx\" (UniqueName: \"kubernetes.io/projected/183dc39f-4089-4993-b806-0c8a6a76c58a-kube-api-access-44jrx\") pod \"nova-cell1-novncproxy-0\" (UID: \"183dc39f-4089-4993-b806-0c8a6a76c58a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.728057 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183dc39f-4089-4993-b806-0c8a6a76c58a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"183dc39f-4089-4993-b806-0c8a6a76c58a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.728145 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a313ac16-c355-4334-b2dc-3da3b3229062-config-data\") pod \"nova-scheduler-0\" (UID: \"a313ac16-c355-4334-b2dc-3da3b3229062\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.728275 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a313ac16-c355-4334-b2dc-3da3b3229062-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a313ac16-c355-4334-b2dc-3da3b3229062\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.728371 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snpcj\" (UniqueName: \"kubernetes.io/projected/a313ac16-c355-4334-b2dc-3da3b3229062-kube-api-access-snpcj\") pod \"nova-scheduler-0\" (UID: 
\"a313ac16-c355-4334-b2dc-3da3b3229062\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.728482 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183dc39f-4089-4993-b806-0c8a6a76c58a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"183dc39f-4089-4993-b806-0c8a6a76c58a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.742516 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.744120 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.751235 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.755838 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.787262 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.789853 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.791781 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.830190 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-logs\") pod \"nova-api-0\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " pod="openstack/nova-api-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.833131 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a313ac16-c355-4334-b2dc-3da3b3229062-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a313ac16-c355-4334-b2dc-3da3b3229062\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.833283 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39613549-339a-4234-8786-b4dc19d9ceee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " pod="openstack/nova-metadata-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.841729 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.842493 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7bjr\" (UniqueName: \"kubernetes.io/projected/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-kube-api-access-h7bjr\") pod \"nova-api-0\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " pod="openstack/nova-api-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.842556 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snpcj\" (UniqueName: \"kubernetes.io/projected/a313ac16-c355-4334-b2dc-3da3b3229062-kube-api-access-snpcj\") pod \"nova-scheduler-0\" (UID: \"a313ac16-c355-4334-b2dc-3da3b3229062\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.842614 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39613549-339a-4234-8786-b4dc19d9ceee-config-data\") pod \"nova-metadata-0\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " pod="openstack/nova-metadata-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.842642 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-config-data\") pod \"nova-api-0\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " pod="openstack/nova-api-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.842722 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183dc39f-4089-4993-b806-0c8a6a76c58a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"183dc39f-4089-4993-b806-0c8a6a76c58a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:41 crc kubenswrapper[4755]: 
I1006 08:40:41.842751 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39613549-339a-4234-8786-b4dc19d9ceee-logs\") pod \"nova-metadata-0\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " pod="openstack/nova-metadata-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.842770 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " pod="openstack/nova-api-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.842791 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44jrx\" (UniqueName: \"kubernetes.io/projected/183dc39f-4089-4993-b806-0c8a6a76c58a-kube-api-access-44jrx\") pod \"nova-cell1-novncproxy-0\" (UID: \"183dc39f-4089-4993-b806-0c8a6a76c58a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.842822 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183dc39f-4089-4993-b806-0c8a6a76c58a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"183dc39f-4089-4993-b806-0c8a6a76c58a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.842860 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a313ac16-c355-4334-b2dc-3da3b3229062-config-data\") pod \"nova-scheduler-0\" (UID: \"a313ac16-c355-4334-b2dc-3da3b3229062\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.842885 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-d9fb9\" (UniqueName: \"kubernetes.io/projected/39613549-339a-4234-8786-b4dc19d9ceee-kube-api-access-d9fb9\") pod \"nova-metadata-0\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " pod="openstack/nova-metadata-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.858748 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a313ac16-c355-4334-b2dc-3da3b3229062-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a313ac16-c355-4334-b2dc-3da3b3229062\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.863338 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a313ac16-c355-4334-b2dc-3da3b3229062-config-data\") pod \"nova-scheduler-0\" (UID: \"a313ac16-c355-4334-b2dc-3da3b3229062\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.867894 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44jrx\" (UniqueName: \"kubernetes.io/projected/183dc39f-4089-4993-b806-0c8a6a76c58a-kube-api-access-44jrx\") pod \"nova-cell1-novncproxy-0\" (UID: \"183dc39f-4089-4993-b806-0c8a6a76c58a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.868256 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183dc39f-4089-4993-b806-0c8a6a76c58a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"183dc39f-4089-4993-b806-0c8a6a76c58a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.868748 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183dc39f-4089-4993-b806-0c8a6a76c58a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"183dc39f-4089-4993-b806-0c8a6a76c58a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.869726 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snpcj\" (UniqueName: \"kubernetes.io/projected/a313ac16-c355-4334-b2dc-3da3b3229062-kube-api-access-snpcj\") pod \"nova-scheduler-0\" (UID: \"a313ac16-c355-4334-b2dc-3da3b3229062\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.900954 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.950952 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-logs\") pod \"nova-api-0\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " pod="openstack/nova-api-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.951016 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39613549-339a-4234-8786-b4dc19d9ceee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " pod="openstack/nova-metadata-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.951045 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7bjr\" (UniqueName: \"kubernetes.io/projected/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-kube-api-access-h7bjr\") pod \"nova-api-0\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " pod="openstack/nova-api-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.951101 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39613549-339a-4234-8786-b4dc19d9ceee-config-data\") pod \"nova-metadata-0\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " 
pod="openstack/nova-metadata-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.951127 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-config-data\") pod \"nova-api-0\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " pod="openstack/nova-api-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.951216 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39613549-339a-4234-8786-b4dc19d9ceee-logs\") pod \"nova-metadata-0\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " pod="openstack/nova-metadata-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.951242 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " pod="openstack/nova-api-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.951290 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9fb9\" (UniqueName: \"kubernetes.io/projected/39613549-339a-4234-8786-b4dc19d9ceee-kube-api-access-d9fb9\") pod \"nova-metadata-0\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " pod="openstack/nova-metadata-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.952079 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-logs\") pod \"nova-api-0\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " pod="openstack/nova-api-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.957895 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/39613549-339a-4234-8786-b4dc19d9ceee-logs\") pod \"nova-metadata-0\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " pod="openstack/nova-metadata-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.960638 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39613549-339a-4234-8786-b4dc19d9ceee-config-data\") pod \"nova-metadata-0\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " pod="openstack/nova-metadata-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.966142 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-config-data\") pod \"nova-api-0\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " pod="openstack/nova-api-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.967636 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-rm2ld"] Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.970854 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " pod="openstack/nova-api-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.971886 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.978784 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7bjr\" (UniqueName: \"kubernetes.io/projected/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-kube-api-access-h7bjr\") pod \"nova-api-0\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " pod="openstack/nova-api-0" Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.980370 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-rm2ld"] Oct 06 08:40:41 crc kubenswrapper[4755]: I1006 08:40:41.987381 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9fb9\" (UniqueName: \"kubernetes.io/projected/39613549-339a-4234-8786-b4dc19d9ceee-kube-api-access-d9fb9\") pod \"nova-metadata-0\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " pod="openstack/nova-metadata-0" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.001707 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39613549-339a-4234-8786-b4dc19d9ceee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " pod="openstack/nova-metadata-0" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.053742 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-dns-svc\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.053797 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dvc8\" (UniqueName: 
\"kubernetes.io/projected/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-kube-api-access-4dvc8\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.055960 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.056099 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-config\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.056180 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.067445 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.086004 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.128888 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.142577 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.158210 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-dns-svc\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.158246 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dvc8\" (UniqueName: \"kubernetes.io/projected/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-kube-api-access-4dvc8\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.158327 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.158374 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-config\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.158401 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.159165 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.159239 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-dns-svc\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.159905 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-config\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.160107 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.175424 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dvc8\" (UniqueName: \"kubernetes.io/projected/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-kube-api-access-4dvc8\") pod 
\"dnsmasq-dns-566b5b7845-rm2ld\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.341924 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.429667 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wfkm6"] Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.561481 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.585426 4755 generic.go:334] "Generic (PLEG): container finished" podID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerID="077ca37ba58734160a5affda7ccce5c912f96172bc9ce8230e120ab7c7e5cc9e" exitCode=0 Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.585666 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.587543 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b609c4-d03c-4cdf-941f-99913b969b0f","Type":"ContainerDied","Data":"077ca37ba58734160a5affda7ccce5c912f96172bc9ce8230e120ab7c7e5cc9e"} Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.587614 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b609c4-d03c-4cdf-941f-99913b969b0f","Type":"ContainerDied","Data":"7ac60a0d7819e95723b719cec51f734e34ea7c48836d7bfbef11600c500854b3"} Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.587636 4755 scope.go:117] "RemoveContainer" containerID="97f32715f7e9c4e7a76b749adc785d2583bf623fbadd1d6d4535b99983f709df" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.600131 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wfkm6" event={"ID":"ce67a97c-6bfd-4684-be25-c82eec5f8237","Type":"ContainerStarted","Data":"5cd9a3abe907b573a0817da845b5f15e9dc6f22eb2d7e14a0b9dcfc732e2c317"} Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.618472 4755 scope.go:117] "RemoveContainer" containerID="e02dda8ba0b178c2ff5c788b3f73f2ee3fcda64678b3bdfdcfdbed36f711e3b7" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.655035 4755 scope.go:117] "RemoveContainer" containerID="077ca37ba58734160a5affda7ccce5c912f96172bc9ce8230e120ab7c7e5cc9e" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.672479 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-config-data\") pod \"35b609c4-d03c-4cdf-941f-99913b969b0f\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.672556 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-twlms\" (UniqueName: \"kubernetes.io/projected/35b609c4-d03c-4cdf-941f-99913b969b0f-kube-api-access-twlms\") pod \"35b609c4-d03c-4cdf-941f-99913b969b0f\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.672616 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-scripts\") pod \"35b609c4-d03c-4cdf-941f-99913b969b0f\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.672688 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b609c4-d03c-4cdf-941f-99913b969b0f-log-httpd\") pod \"35b609c4-d03c-4cdf-941f-99913b969b0f\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.672743 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b609c4-d03c-4cdf-941f-99913b969b0f-run-httpd\") pod \"35b609c4-d03c-4cdf-941f-99913b969b0f\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.672766 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-sg-core-conf-yaml\") pod \"35b609c4-d03c-4cdf-941f-99913b969b0f\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.672950 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-combined-ca-bundle\") pod \"35b609c4-d03c-4cdf-941f-99913b969b0f\" (UID: \"35b609c4-d03c-4cdf-941f-99913b969b0f\") " Oct 06 08:40:42 crc 
kubenswrapper[4755]: I1006 08:40:42.673678 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35b609c4-d03c-4cdf-941f-99913b969b0f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "35b609c4-d03c-4cdf-941f-99913b969b0f" (UID: "35b609c4-d03c-4cdf-941f-99913b969b0f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.673921 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35b609c4-d03c-4cdf-941f-99913b969b0f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "35b609c4-d03c-4cdf-941f-99913b969b0f" (UID: "35b609c4-d03c-4cdf-941f-99913b969b0f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.677781 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35b609c4-d03c-4cdf-941f-99913b969b0f-kube-api-access-twlms" (OuterVolumeSpecName: "kube-api-access-twlms") pod "35b609c4-d03c-4cdf-941f-99913b969b0f" (UID: "35b609c4-d03c-4cdf-941f-99913b969b0f"). InnerVolumeSpecName "kube-api-access-twlms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.678665 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-scripts" (OuterVolumeSpecName: "scripts") pod "35b609c4-d03c-4cdf-941f-99913b969b0f" (UID: "35b609c4-d03c-4cdf-941f-99913b969b0f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.693959 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8f45x"] Oct 06 08:40:42 crc kubenswrapper[4755]: E1006 08:40:42.694521 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="proxy-httpd" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.694547 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="proxy-httpd" Oct 06 08:40:42 crc kubenswrapper[4755]: E1006 08:40:42.694608 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="sg-core" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.694619 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="sg-core" Oct 06 08:40:42 crc kubenswrapper[4755]: E1006 08:40:42.694641 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="ceilometer-central-agent" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.694650 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="ceilometer-central-agent" Oct 06 08:40:42 crc kubenswrapper[4755]: E1006 08:40:42.694667 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="ceilometer-notification-agent" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.694675 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="ceilometer-notification-agent" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.695881 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="ceilometer-central-agent" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.695942 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="proxy-httpd" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.695953 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="ceilometer-notification-agent" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.695965 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" containerName="sg-core" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.696795 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.700676 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.701001 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.719694 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8f45x"] Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.732855 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.790099 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "35b609c4-d03c-4cdf-941f-99913b969b0f" (UID: "35b609c4-d03c-4cdf-941f-99913b969b0f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.794935 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-config-data\") pod \"nova-cell1-conductor-db-sync-8f45x\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.794991 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-scripts\") pod \"nova-cell1-conductor-db-sync-8f45x\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.795082 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b44tc\" (UniqueName: \"kubernetes.io/projected/ca576ccd-2a13-4b2c-ab8e-df22112b4711-kube-api-access-b44tc\") pod \"nova-cell1-conductor-db-sync-8f45x\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.795228 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8f45x\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.795276 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:42 crc 
kubenswrapper[4755]: I1006 08:40:42.795286 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twlms\" (UniqueName: \"kubernetes.io/projected/35b609c4-d03c-4cdf-941f-99913b969b0f-kube-api-access-twlms\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.795297 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b609c4-d03c-4cdf-941f-99913b969b0f-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.795306 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.795313 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b609c4-d03c-4cdf-941f-99913b969b0f-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.847496 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35b609c4-d03c-4cdf-941f-99913b969b0f" (UID: "35b609c4-d03c-4cdf-941f-99913b969b0f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.897700 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8f45x\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.897953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-config-data\") pod \"nova-cell1-conductor-db-sync-8f45x\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.898187 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-scripts\") pod \"nova-cell1-conductor-db-sync-8f45x\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.898299 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b44tc\" (UniqueName: \"kubernetes.io/projected/ca576ccd-2a13-4b2c-ab8e-df22112b4711-kube-api-access-b44tc\") pod \"nova-cell1-conductor-db-sync-8f45x\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.898512 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.904385 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-scripts\") pod \"nova-cell1-conductor-db-sync-8f45x\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.905100 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8f45x\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.913954 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-config-data\") pod \"nova-cell1-conductor-db-sync-8f45x\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.921157 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b44tc\" (UniqueName: \"kubernetes.io/projected/ca576ccd-2a13-4b2c-ab8e-df22112b4711-kube-api-access-b44tc\") pod \"nova-cell1-conductor-db-sync-8f45x\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.923903 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.933944 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-config-data" (OuterVolumeSpecName: "config-data") pod "35b609c4-d03c-4cdf-941f-99913b969b0f" (UID: "35b609c4-d03c-4cdf-941f-99913b969b0f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:42 crc kubenswrapper[4755]: W1006 08:40:42.934102 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48b49ce4_e2fc_4393_9de1_b556f7fbd7eb.slice/crio-23238e573a1436617392b0c2adccdc5ffbc0ae539daff6fa3dbb461d8dc1c376 WatchSource:0}: Error finding container 23238e573a1436617392b0c2adccdc5ffbc0ae539daff6fa3dbb461d8dc1c376: Status 404 returned error can't find the container with id 23238e573a1436617392b0c2adccdc5ffbc0ae539daff6fa3dbb461d8dc1c376 Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.963079 4755 scope.go:117] "RemoveContainer" containerID="f1997516c63c17a3e38f46cefb38aca043014748748967f4d29de2a1582e54c5" Oct 06 08:40:42 crc kubenswrapper[4755]: I1006 08:40:42.968452 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.000924 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b609c4-d03c-4cdf-941f-99913b969b0f-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.047728 4755 scope.go:117] "RemoveContainer" containerID="97f32715f7e9c4e7a76b749adc785d2583bf623fbadd1d6d4535b99983f709df" Oct 06 08:40:43 crc kubenswrapper[4755]: E1006 08:40:43.048124 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97f32715f7e9c4e7a76b749adc785d2583bf623fbadd1d6d4535b99983f709df\": container with ID starting with 97f32715f7e9c4e7a76b749adc785d2583bf623fbadd1d6d4535b99983f709df not found: ID does not exist" containerID="97f32715f7e9c4e7a76b749adc785d2583bf623fbadd1d6d4535b99983f709df" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.048157 4755 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f32715f7e9c4e7a76b749adc785d2583bf623fbadd1d6d4535b99983f709df"} err="failed to get container status \"97f32715f7e9c4e7a76b749adc785d2583bf623fbadd1d6d4535b99983f709df\": rpc error: code = NotFound desc = could not find container \"97f32715f7e9c4e7a76b749adc785d2583bf623fbadd1d6d4535b99983f709df\": container with ID starting with 97f32715f7e9c4e7a76b749adc785d2583bf623fbadd1d6d4535b99983f709df not found: ID does not exist" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.048183 4755 scope.go:117] "RemoveContainer" containerID="e02dda8ba0b178c2ff5c788b3f73f2ee3fcda64678b3bdfdcfdbed36f711e3b7" Oct 06 08:40:43 crc kubenswrapper[4755]: E1006 08:40:43.048455 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e02dda8ba0b178c2ff5c788b3f73f2ee3fcda64678b3bdfdcfdbed36f711e3b7\": container with ID starting with e02dda8ba0b178c2ff5c788b3f73f2ee3fcda64678b3bdfdcfdbed36f711e3b7 not found: ID does not exist" containerID="e02dda8ba0b178c2ff5c788b3f73f2ee3fcda64678b3bdfdcfdbed36f711e3b7" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.048473 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e02dda8ba0b178c2ff5c788b3f73f2ee3fcda64678b3bdfdcfdbed36f711e3b7"} err="failed to get container status \"e02dda8ba0b178c2ff5c788b3f73f2ee3fcda64678b3bdfdcfdbed36f711e3b7\": rpc error: code = NotFound desc = could not find container \"e02dda8ba0b178c2ff5c788b3f73f2ee3fcda64678b3bdfdcfdbed36f711e3b7\": container with ID starting with e02dda8ba0b178c2ff5c788b3f73f2ee3fcda64678b3bdfdcfdbed36f711e3b7 not found: ID does not exist" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.048487 4755 scope.go:117] "RemoveContainer" containerID="077ca37ba58734160a5affda7ccce5c912f96172bc9ce8230e120ab7c7e5cc9e" Oct 06 08:40:43 crc kubenswrapper[4755]: E1006 08:40:43.048784 4755 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"077ca37ba58734160a5affda7ccce5c912f96172bc9ce8230e120ab7c7e5cc9e\": container with ID starting with 077ca37ba58734160a5affda7ccce5c912f96172bc9ce8230e120ab7c7e5cc9e not found: ID does not exist" containerID="077ca37ba58734160a5affda7ccce5c912f96172bc9ce8230e120ab7c7e5cc9e" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.048813 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"077ca37ba58734160a5affda7ccce5c912f96172bc9ce8230e120ab7c7e5cc9e"} err="failed to get container status \"077ca37ba58734160a5affda7ccce5c912f96172bc9ce8230e120ab7c7e5cc9e\": rpc error: code = NotFound desc = could not find container \"077ca37ba58734160a5affda7ccce5c912f96172bc9ce8230e120ab7c7e5cc9e\": container with ID starting with 077ca37ba58734160a5affda7ccce5c912f96172bc9ce8230e120ab7c7e5cc9e not found: ID does not exist" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.048827 4755 scope.go:117] "RemoveContainer" containerID="f1997516c63c17a3e38f46cefb38aca043014748748967f4d29de2a1582e54c5" Oct 06 08:40:43 crc kubenswrapper[4755]: E1006 08:40:43.049103 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1997516c63c17a3e38f46cefb38aca043014748748967f4d29de2a1582e54c5\": container with ID starting with f1997516c63c17a3e38f46cefb38aca043014748748967f4d29de2a1582e54c5 not found: ID does not exist" containerID="f1997516c63c17a3e38f46cefb38aca043014748748967f4d29de2a1582e54c5" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.049129 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1997516c63c17a3e38f46cefb38aca043014748748967f4d29de2a1582e54c5"} err="failed to get container status \"f1997516c63c17a3e38f46cefb38aca043014748748967f4d29de2a1582e54c5\": rpc error: code = NotFound desc = could 
not find container \"f1997516c63c17a3e38f46cefb38aca043014748748967f4d29de2a1582e54c5\": container with ID starting with f1997516c63c17a3e38f46cefb38aca043014748748967f4d29de2a1582e54c5 not found: ID does not exist" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.124846 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-rm2ld"] Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.187904 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.205893 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.311942 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.319777 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.333784 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.337064 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.339720 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.340035 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.340302 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.343282 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.505669 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8f45x"] Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.510343 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-scripts\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.510512 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.510668 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87511693-db15-4fef-b3f8-a48e99ddfb0b-run-httpd\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 
crc kubenswrapper[4755]: I1006 08:40:43.510700 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.510727 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.510777 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87511693-db15-4fef-b3f8-a48e99ddfb0b-log-httpd\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.510798 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-config-data\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.511347 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltv6v\" (UniqueName: \"kubernetes.io/projected/87511693-db15-4fef-b3f8-a48e99ddfb0b-kube-api-access-ltv6v\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: W1006 08:40:43.521384 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca576ccd_2a13_4b2c_ab8e_df22112b4711.slice/crio-0a18c82685106438e754ee8dd2ff5414a92f195446a9b5d7893687cd73a29c63 WatchSource:0}: Error finding container 0a18c82685106438e754ee8dd2ff5414a92f195446a9b5d7893687cd73a29c63: Status 404 returned error can't find the container with id 0a18c82685106438e754ee8dd2ff5414a92f195446a9b5d7893687cd73a29c63 Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.616272 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-scripts\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.616715 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a313ac16-c355-4334-b2dc-3da3b3229062","Type":"ContainerStarted","Data":"c58902d599c551142e44afadbf1b34cefb7ed810f401912cdd6ccab6f5674a7b"} Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.617604 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.617820 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87511693-db15-4fef-b3f8-a48e99ddfb0b-run-httpd\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.617845 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.618078 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.618618 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87511693-db15-4fef-b3f8-a48e99ddfb0b-log-httpd\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.618886 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-config-data\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.619816 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87511693-db15-4fef-b3f8-a48e99ddfb0b-log-httpd\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.621421 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87511693-db15-4fef-b3f8-a48e99ddfb0b-run-httpd\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.621601 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltv6v\" (UniqueName: \"kubernetes.io/projected/87511693-db15-4fef-b3f8-a48e99ddfb0b-kube-api-access-ltv6v\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.621626 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-scripts\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.622533 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.623306 4755 generic.go:334] "Generic (PLEG): container finished" podID="4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" containerID="ab9ef6cbb44056ec6d7941f58e46bb60db3c5ca7bd98320dae22b8f2534c0ff9" exitCode=0 Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.623533 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" event={"ID":"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a","Type":"ContainerDied","Data":"ab9ef6cbb44056ec6d7941f58e46bb60db3c5ca7bd98320dae22b8f2534c0ff9"} Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.623705 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" event={"ID":"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a","Type":"ContainerStarted","Data":"f9e621997d2dbbfedb619af74a36de0a3d5a0cb52c995454acca54e04a60681d"} Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.625473 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-config-data\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.626134 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.652943 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"183dc39f-4089-4993-b806-0c8a6a76c58a","Type":"ContainerStarted","Data":"c7c01112051c18413b4c3123e0d770bdec3fed1fce5bf63941ec5ce3f69e9b3d"} Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.653257 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.656167 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wfkm6" event={"ID":"ce67a97c-6bfd-4684-be25-c82eec5f8237","Type":"ContainerStarted","Data":"2b8ba3be8bf9e372d59574f01a2aafd9839e1f4bf9178e3242945ac088719f55"} Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.662628 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8f45x" event={"ID":"ca576ccd-2a13-4b2c-ab8e-df22112b4711","Type":"ContainerStarted","Data":"0a18c82685106438e754ee8dd2ff5414a92f195446a9b5d7893687cd73a29c63"} Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.671328 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb","Type":"ContainerStarted","Data":"23238e573a1436617392b0c2adccdc5ffbc0ae539daff6fa3dbb461d8dc1c376"} Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.677738 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39613549-339a-4234-8786-b4dc19d9ceee","Type":"ContainerStarted","Data":"dd694b9ce1e8605b9dfc46b5e2973c9044a0c2f6a86c1e24afa3e2c3983b09b0"} Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.686885 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltv6v\" (UniqueName: \"kubernetes.io/projected/87511693-db15-4fef-b3f8-a48e99ddfb0b-kube-api-access-ltv6v\") pod \"ceilometer-0\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.694254 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-wfkm6" podStartSLOduration=2.694237064 podStartE2EDuration="2.694237064s" podCreationTimestamp="2025-10-06 08:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:43.688206471 +0000 UTC m=+1100.517521705" watchObservedRunningTime="2025-10-06 08:40:43.694237064 +0000 UTC m=+1100.523552278" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.751188 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:43 crc kubenswrapper[4755]: I1006 08:40:43.947803 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35b609c4-d03c-4cdf-941f-99913b969b0f" path="/var/lib/kubelet/pods/35b609c4-d03c-4cdf-941f-99913b969b0f/volumes" Oct 06 08:40:44 crc kubenswrapper[4755]: I1006 08:40:44.340981 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:44 crc kubenswrapper[4755]: I1006 08:40:44.690582 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" event={"ID":"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a","Type":"ContainerStarted","Data":"a06c02377a226827f0e196db5b86adf81004c0aed0ba5818dc7b6977306028c9"} Oct 06 08:40:44 crc kubenswrapper[4755]: I1006 08:40:44.691219 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:44 crc kubenswrapper[4755]: I1006 08:40:44.694991 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8f45x" event={"ID":"ca576ccd-2a13-4b2c-ab8e-df22112b4711","Type":"ContainerStarted","Data":"8aac7e5406185c0e45a14415ceb0bf3f16b3de7488b4f5a15ee14ac46996736f"} Oct 06 08:40:44 crc kubenswrapper[4755]: I1006 08:40:44.717823 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" podStartSLOduration=3.717805462 podStartE2EDuration="3.717805462s" podCreationTimestamp="2025-10-06 08:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:44.708048404 +0000 UTC m=+1101.537363618" watchObservedRunningTime="2025-10-06 08:40:44.717805462 +0000 UTC m=+1101.547120676" Oct 06 08:40:44 crc kubenswrapper[4755]: I1006 08:40:44.740539 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-conductor-db-sync-8f45x" podStartSLOduration=2.74051009 podStartE2EDuration="2.74051009s" podCreationTimestamp="2025-10-06 08:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:44.730969767 +0000 UTC m=+1101.560284991" watchObservedRunningTime="2025-10-06 08:40:44.74051009 +0000 UTC m=+1101.569825304" Oct 06 08:40:45 crc kubenswrapper[4755]: W1006 08:40:45.188710 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87511693_db15_4fef_b3f8_a48e99ddfb0b.slice/crio-368fcee0b92826739d697260d0674f6fbdfe233de321437487511307c9d8ef9e WatchSource:0}: Error finding container 368fcee0b92826739d697260d0674f6fbdfe233de321437487511307c9d8ef9e: Status 404 returned error can't find the container with id 368fcee0b92826739d697260d0674f6fbdfe233de321437487511307c9d8ef9e Oct 06 08:40:45 crc kubenswrapper[4755]: I1006 08:40:45.206488 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 08:40:45 crc kubenswrapper[4755]: I1006 08:40:45.246508 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:45 crc kubenswrapper[4755]: I1006 08:40:45.704600 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87511693-db15-4fef-b3f8-a48e99ddfb0b","Type":"ContainerStarted","Data":"368fcee0b92826739d697260d0674f6fbdfe233de321437487511307c9d8ef9e"} Oct 06 08:40:46 crc kubenswrapper[4755]: I1006 08:40:46.718036 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39613549-339a-4234-8786-b4dc19d9ceee","Type":"ContainerStarted","Data":"62a97897acc65ad0eea6e88efed9f52644cb2a964c8e2072c5a69fb68e1ff7be"} Oct 06 08:40:46 crc kubenswrapper[4755]: I1006 08:40:46.721309 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-scheduler-0" event={"ID":"a313ac16-c355-4334-b2dc-3da3b3229062","Type":"ContainerStarted","Data":"7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469"} Oct 06 08:40:46 crc kubenswrapper[4755]: I1006 08:40:46.723172 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="183dc39f-4089-4993-b806-0c8a6a76c58a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7aa602c9a9c5e5e219815aca8a8c35ed14ef6ac6507b2621b108c1076f81142e" gracePeriod=30 Oct 06 08:40:46 crc kubenswrapper[4755]: I1006 08:40:46.723191 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"183dc39f-4089-4993-b806-0c8a6a76c58a","Type":"ContainerStarted","Data":"7aa602c9a9c5e5e219815aca8a8c35ed14ef6ac6507b2621b108c1076f81142e"} Oct 06 08:40:46 crc kubenswrapper[4755]: I1006 08:40:46.728973 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb","Type":"ContainerStarted","Data":"1078790298f9dcbdb42a2385e6f84b2d1fbb0760019bac6563bfab9262024ed9"} Oct 06 08:40:46 crc kubenswrapper[4755]: I1006 08:40:46.740203 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.552284588 podStartE2EDuration="5.740184419s" podCreationTimestamp="2025-10-06 08:40:41 +0000 UTC" firstStartedPulling="2025-10-06 08:40:43.143011032 +0000 UTC m=+1099.972326246" lastFinishedPulling="2025-10-06 08:40:46.330910863 +0000 UTC m=+1103.160226077" observedRunningTime="2025-10-06 08:40:46.739964114 +0000 UTC m=+1103.569279328" watchObservedRunningTime="2025-10-06 08:40:46.740184419 +0000 UTC m=+1103.569499633" Oct 06 08:40:46 crc kubenswrapper[4755]: I1006 08:40:46.768886 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.168154672 
podStartE2EDuration="5.768862091s" podCreationTimestamp="2025-10-06 08:40:41 +0000 UTC" firstStartedPulling="2025-10-06 08:40:42.729618819 +0000 UTC m=+1099.558934033" lastFinishedPulling="2025-10-06 08:40:46.330326238 +0000 UTC m=+1103.159641452" observedRunningTime="2025-10-06 08:40:46.760040576 +0000 UTC m=+1103.589355790" watchObservedRunningTime="2025-10-06 08:40:46.768862091 +0000 UTC m=+1103.598177305" Oct 06 08:40:46 crc kubenswrapper[4755]: I1006 08:40:46.908133 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 06 08:40:47 crc kubenswrapper[4755]: I1006 08:40:47.068756 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 08:40:47 crc kubenswrapper[4755]: I1006 08:40:47.087053 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:47 crc kubenswrapper[4755]: I1006 08:40:47.742544 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb","Type":"ContainerStarted","Data":"c043b77332c8f73f114438b659342a72150aafed35cfe28b254e3ef0d7ea2c8d"} Oct 06 08:40:47 crc kubenswrapper[4755]: I1006 08:40:47.747535 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87511693-db15-4fef-b3f8-a48e99ddfb0b","Type":"ContainerStarted","Data":"4031ca26b25c3fa4fe7efa088d8541608f19ba99fdb5e3a93d2e03acde1582b6"} Oct 06 08:40:47 crc kubenswrapper[4755]: I1006 08:40:47.757977 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="39613549-339a-4234-8786-b4dc19d9ceee" containerName="nova-metadata-log" containerID="cri-o://62a97897acc65ad0eea6e88efed9f52644cb2a964c8e2072c5a69fb68e1ff7be" gracePeriod=30 Oct 06 08:40:47 crc kubenswrapper[4755]: I1006 08:40:47.758232 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"39613549-339a-4234-8786-b4dc19d9ceee","Type":"ContainerStarted","Data":"5cf920348c3c8e57587dff0f1b9fdb867ecf72ea22190d4643c7e60400b43013"} Oct 06 08:40:47 crc kubenswrapper[4755]: I1006 08:40:47.758296 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="39613549-339a-4234-8786-b4dc19d9ceee" containerName="nova-metadata-metadata" containerID="cri-o://5cf920348c3c8e57587dff0f1b9fdb867ecf72ea22190d4643c7e60400b43013" gracePeriod=30 Oct 06 08:40:47 crc kubenswrapper[4755]: I1006 08:40:47.768112 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.373587685 podStartE2EDuration="6.768096526s" podCreationTimestamp="2025-10-06 08:40:41 +0000 UTC" firstStartedPulling="2025-10-06 08:40:42.936426213 +0000 UTC m=+1099.765741427" lastFinishedPulling="2025-10-06 08:40:46.330935054 +0000 UTC m=+1103.160250268" observedRunningTime="2025-10-06 08:40:47.763809436 +0000 UTC m=+1104.593124650" watchObservedRunningTime="2025-10-06 08:40:47.768096526 +0000 UTC m=+1104.597411730" Oct 06 08:40:47 crc kubenswrapper[4755]: I1006 08:40:47.787404 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.594894171 podStartE2EDuration="6.787381168s" podCreationTimestamp="2025-10-06 08:40:41 +0000 UTC" firstStartedPulling="2025-10-06 08:40:43.142692425 +0000 UTC m=+1099.972007639" lastFinishedPulling="2025-10-06 08:40:46.335179422 +0000 UTC m=+1103.164494636" observedRunningTime="2025-10-06 08:40:47.786230529 +0000 UTC m=+1104.615545753" watchObservedRunningTime="2025-10-06 08:40:47.787381168 +0000 UTC m=+1104.616696382" Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.773401 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"87511693-db15-4fef-b3f8-a48e99ddfb0b","Type":"ContainerStarted","Data":"520aaa97134755f260b7a1fafad978d709e978280fa4d7d2d417d181da9efb6b"} Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.777301 4755 generic.go:334] "Generic (PLEG): container finished" podID="39613549-339a-4234-8786-b4dc19d9ceee" containerID="5cf920348c3c8e57587dff0f1b9fdb867ecf72ea22190d4643c7e60400b43013" exitCode=0 Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.777334 4755 generic.go:334] "Generic (PLEG): container finished" podID="39613549-339a-4234-8786-b4dc19d9ceee" containerID="62a97897acc65ad0eea6e88efed9f52644cb2a964c8e2072c5a69fb68e1ff7be" exitCode=143 Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.777371 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39613549-339a-4234-8786-b4dc19d9ceee","Type":"ContainerDied","Data":"5cf920348c3c8e57587dff0f1b9fdb867ecf72ea22190d4643c7e60400b43013"} Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.777411 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39613549-339a-4234-8786-b4dc19d9ceee","Type":"ContainerDied","Data":"62a97897acc65ad0eea6e88efed9f52644cb2a964c8e2072c5a69fb68e1ff7be"} Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.777427 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39613549-339a-4234-8786-b4dc19d9ceee","Type":"ContainerDied","Data":"dd694b9ce1e8605b9dfc46b5e2973c9044a0c2f6a86c1e24afa3e2c3983b09b0"} Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.777440 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd694b9ce1e8605b9dfc46b5e2973c9044a0c2f6a86c1e24afa3e2c3983b09b0" Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.844598 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.932856 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39613549-339a-4234-8786-b4dc19d9ceee-logs\") pod \"39613549-339a-4234-8786-b4dc19d9ceee\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.932982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39613549-339a-4234-8786-b4dc19d9ceee-combined-ca-bundle\") pod \"39613549-339a-4234-8786-b4dc19d9ceee\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.933200 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39613549-339a-4234-8786-b4dc19d9ceee-logs" (OuterVolumeSpecName: "logs") pod "39613549-339a-4234-8786-b4dc19d9ceee" (UID: "39613549-339a-4234-8786-b4dc19d9ceee"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.933287 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9fb9\" (UniqueName: \"kubernetes.io/projected/39613549-339a-4234-8786-b4dc19d9ceee-kube-api-access-d9fb9\") pod \"39613549-339a-4234-8786-b4dc19d9ceee\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.933331 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39613549-339a-4234-8786-b4dc19d9ceee-config-data\") pod \"39613549-339a-4234-8786-b4dc19d9ceee\" (UID: \"39613549-339a-4234-8786-b4dc19d9ceee\") " Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.934081 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39613549-339a-4234-8786-b4dc19d9ceee-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.938958 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39613549-339a-4234-8786-b4dc19d9ceee-kube-api-access-d9fb9" (OuterVolumeSpecName: "kube-api-access-d9fb9") pod "39613549-339a-4234-8786-b4dc19d9ceee" (UID: "39613549-339a-4234-8786-b4dc19d9ceee"). InnerVolumeSpecName "kube-api-access-d9fb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.967366 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39613549-339a-4234-8786-b4dc19d9ceee-config-data" (OuterVolumeSpecName: "config-data") pod "39613549-339a-4234-8786-b4dc19d9ceee" (UID: "39613549-339a-4234-8786-b4dc19d9ceee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:48 crc kubenswrapper[4755]: I1006 08:40:48.983689 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39613549-339a-4234-8786-b4dc19d9ceee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39613549-339a-4234-8786-b4dc19d9ceee" (UID: "39613549-339a-4234-8786-b4dc19d9ceee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.038590 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39613549-339a-4234-8786-b4dc19d9ceee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.038625 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9fb9\" (UniqueName: \"kubernetes.io/projected/39613549-339a-4234-8786-b4dc19d9ceee-kube-api-access-d9fb9\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.038634 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39613549-339a-4234-8786-b4dc19d9ceee-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.790057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87511693-db15-4fef-b3f8-a48e99ddfb0b","Type":"ContainerStarted","Data":"a29021b326428b7a6a7336f78794803385c35f381705af46ce931b01ac90ca9b"} Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.790086 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.826247 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.833810 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.862252 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:49 crc kubenswrapper[4755]: E1006 08:40:49.862803 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39613549-339a-4234-8786-b4dc19d9ceee" containerName="nova-metadata-log" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.862817 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="39613549-339a-4234-8786-b4dc19d9ceee" containerName="nova-metadata-log" Oct 06 08:40:49 crc kubenswrapper[4755]: E1006 08:40:49.862834 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39613549-339a-4234-8786-b4dc19d9ceee" containerName="nova-metadata-metadata" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.862840 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="39613549-339a-4234-8786-b4dc19d9ceee" containerName="nova-metadata-metadata" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.863041 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="39613549-339a-4234-8786-b4dc19d9ceee" containerName="nova-metadata-metadata" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.863055 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="39613549-339a-4234-8786-b4dc19d9ceee" containerName="nova-metadata-log" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.864033 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.867383 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.884201 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.898815 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39613549-339a-4234-8786-b4dc19d9ceee" path="/var/lib/kubelet/pods/39613549-339a-4234-8786-b4dc19d9ceee/volumes" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.899479 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.953443 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e954f36-4ba9-4de9-9289-206daeafc79f-logs\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.953488 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-config-data\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.953510 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.953620 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:49 crc kubenswrapper[4755]: I1006 08:40:49.953638 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf5lc\" (UniqueName: \"kubernetes.io/projected/1e954f36-4ba9-4de9-9289-206daeafc79f-kube-api-access-sf5lc\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.054924 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.054986 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf5lc\" (UniqueName: \"kubernetes.io/projected/1e954f36-4ba9-4de9-9289-206daeafc79f-kube-api-access-sf5lc\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.055094 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e954f36-4ba9-4de9-9289-206daeafc79f-logs\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.055123 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-config-data\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.055149 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.055923 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e954f36-4ba9-4de9-9289-206daeafc79f-logs\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.067088 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-config-data\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.067731 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.069217 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:50 crc 
kubenswrapper[4755]: I1006 08:40:50.076699 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf5lc\" (UniqueName: \"kubernetes.io/projected/1e954f36-4ba9-4de9-9289-206daeafc79f-kube-api-access-sf5lc\") pod \"nova-metadata-0\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " pod="openstack/nova-metadata-0" Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.189259 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.725990 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:50 crc kubenswrapper[4755]: W1006 08:40:50.729000 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e954f36_4ba9_4de9_9289_206daeafc79f.slice/crio-86ff1afce7bf135ddfcf22ddec74d331eec143bebbe44de95811c2b58c6fe21d WatchSource:0}: Error finding container 86ff1afce7bf135ddfcf22ddec74d331eec143bebbe44de95811c2b58c6fe21d: Status 404 returned error can't find the container with id 86ff1afce7bf135ddfcf22ddec74d331eec143bebbe44de95811c2b58c6fe21d Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.800687 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e954f36-4ba9-4de9-9289-206daeafc79f","Type":"ContainerStarted","Data":"86ff1afce7bf135ddfcf22ddec74d331eec143bebbe44de95811c2b58c6fe21d"} Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.804801 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87511693-db15-4fef-b3f8-a48e99ddfb0b","Type":"ContainerStarted","Data":"977b3dbaa3801356a9cef6ba013b8c9187342f9f407f6ad4427e214af08ea598"} Oct 06 08:40:50 crc kubenswrapper[4755]: I1006 08:40:50.804989 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:40:50 crc 
kubenswrapper[4755]: I1006 08:40:50.830113 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.873680236 podStartE2EDuration="7.83009308s" podCreationTimestamp="2025-10-06 08:40:43 +0000 UTC" firstStartedPulling="2025-10-06 08:40:46.233178839 +0000 UTC m=+1103.062494053" lastFinishedPulling="2025-10-06 08:40:50.189591683 +0000 UTC m=+1107.018906897" observedRunningTime="2025-10-06 08:40:50.826788646 +0000 UTC m=+1107.656103870" watchObservedRunningTime="2025-10-06 08:40:50.83009308 +0000 UTC m=+1107.659408294" Oct 06 08:40:51 crc kubenswrapper[4755]: I1006 08:40:51.815959 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e954f36-4ba9-4de9-9289-206daeafc79f","Type":"ContainerStarted","Data":"88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89"} Oct 06 08:40:51 crc kubenswrapper[4755]: I1006 08:40:51.816525 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e954f36-4ba9-4de9-9289-206daeafc79f","Type":"ContainerStarted","Data":"90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567"} Oct 06 08:40:51 crc kubenswrapper[4755]: I1006 08:40:51.818549 4755 generic.go:334] "Generic (PLEG): container finished" podID="ce67a97c-6bfd-4684-be25-c82eec5f8237" containerID="2b8ba3be8bf9e372d59574f01a2aafd9839e1f4bf9178e3242945ac088719f55" exitCode=0 Oct 06 08:40:51 crc kubenswrapper[4755]: I1006 08:40:51.818695 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wfkm6" event={"ID":"ce67a97c-6bfd-4684-be25-c82eec5f8237","Type":"ContainerDied","Data":"2b8ba3be8bf9e372d59574f01a2aafd9839e1f4bf9178e3242945ac088719f55"} Oct 06 08:40:51 crc kubenswrapper[4755]: I1006 08:40:51.843239 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.84321955 podStartE2EDuration="2.84321955s" 
podCreationTimestamp="2025-10-06 08:40:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:51.830505005 +0000 UTC m=+1108.659820219" watchObservedRunningTime="2025-10-06 08:40:51.84321955 +0000 UTC m=+1108.672534764" Oct 06 08:40:52 crc kubenswrapper[4755]: I1006 08:40:52.069295 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 08:40:52 crc kubenswrapper[4755]: I1006 08:40:52.099601 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 08:40:52 crc kubenswrapper[4755]: I1006 08:40:52.129948 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 08:40:52 crc kubenswrapper[4755]: I1006 08:40:52.130044 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 08:40:52 crc kubenswrapper[4755]: I1006 08:40:52.343880 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:40:52 crc kubenswrapper[4755]: I1006 08:40:52.441587 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-989gj"] Oct 06 08:40:52 crc kubenswrapper[4755]: I1006 08:40:52.441839 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" podUID="e18674b4-2633-4819-8e4c-81e122186c0b" containerName="dnsmasq-dns" containerID="cri-o://54d3032fb98e75f87d1d15129afc65687b122c385be02c187f1f29a22bfea33e" gracePeriod=10 Oct 06 08:40:52 crc kubenswrapper[4755]: I1006 08:40:52.842426 4755 generic.go:334] "Generic (PLEG): container finished" podID="e18674b4-2633-4819-8e4c-81e122186c0b" containerID="54d3032fb98e75f87d1d15129afc65687b122c385be02c187f1f29a22bfea33e" exitCode=0 Oct 06 08:40:52 crc kubenswrapper[4755]: 
I1006 08:40:52.843065 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" event={"ID":"e18674b4-2633-4819-8e4c-81e122186c0b","Type":"ContainerDied","Data":"54d3032fb98e75f87d1d15129afc65687b122c385be02c187f1f29a22bfea33e"} Oct 06 08:40:52 crc kubenswrapper[4755]: I1006 08:40:52.874465 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.050623 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.132957 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksc6b\" (UniqueName: \"kubernetes.io/projected/e18674b4-2633-4819-8e4c-81e122186c0b-kube-api-access-ksc6b\") pod \"e18674b4-2633-4819-8e4c-81e122186c0b\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.133020 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-config\") pod \"e18674b4-2633-4819-8e4c-81e122186c0b\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.133153 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-ovsdbserver-sb\") pod \"e18674b4-2633-4819-8e4c-81e122186c0b\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.133232 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-ovsdbserver-nb\") pod \"e18674b4-2633-4819-8e4c-81e122186c0b\" 
(UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.133346 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-dns-svc\") pod \"e18674b4-2633-4819-8e4c-81e122186c0b\" (UID: \"e18674b4-2633-4819-8e4c-81e122186c0b\") " Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.150849 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e18674b4-2633-4819-8e4c-81e122186c0b-kube-api-access-ksc6b" (OuterVolumeSpecName: "kube-api-access-ksc6b") pod "e18674b4-2633-4819-8e4c-81e122186c0b" (UID: "e18674b4-2633-4819-8e4c-81e122186c0b"). InnerVolumeSpecName "kube-api-access-ksc6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.208339 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-config" (OuterVolumeSpecName: "config") pod "e18674b4-2633-4819-8e4c-81e122186c0b" (UID: "e18674b4-2633-4819-8e4c-81e122186c0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.212840 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e18674b4-2633-4819-8e4c-81e122186c0b" (UID: "e18674b4-2633-4819-8e4c-81e122186c0b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.219872 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.173:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.220073 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.173:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.227300 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e18674b4-2633-4819-8e4c-81e122186c0b" (UID: "e18674b4-2633-4819-8e4c-81e122186c0b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.237515 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.237639 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.237674 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksc6b\" (UniqueName: \"kubernetes.io/projected/e18674b4-2633-4819-8e4c-81e122186c0b-kube-api-access-ksc6b\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.237757 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.249879 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.260679 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e18674b4-2633-4819-8e4c-81e122186c0b" (UID: "e18674b4-2633-4819-8e4c-81e122186c0b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.338843 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-combined-ca-bundle\") pod \"ce67a97c-6bfd-4684-be25-c82eec5f8237\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.338890 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5hfw\" (UniqueName: \"kubernetes.io/projected/ce67a97c-6bfd-4684-be25-c82eec5f8237-kube-api-access-r5hfw\") pod \"ce67a97c-6bfd-4684-be25-c82eec5f8237\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.339071 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-config-data\") pod \"ce67a97c-6bfd-4684-be25-c82eec5f8237\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.339118 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-scripts\") pod \"ce67a97c-6bfd-4684-be25-c82eec5f8237\" (UID: \"ce67a97c-6bfd-4684-be25-c82eec5f8237\") " Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.339499 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e18674b4-2633-4819-8e4c-81e122186c0b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.342618 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-scripts" (OuterVolumeSpecName: "scripts") pod 
"ce67a97c-6bfd-4684-be25-c82eec5f8237" (UID: "ce67a97c-6bfd-4684-be25-c82eec5f8237"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.342809 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce67a97c-6bfd-4684-be25-c82eec5f8237-kube-api-access-r5hfw" (OuterVolumeSpecName: "kube-api-access-r5hfw") pod "ce67a97c-6bfd-4684-be25-c82eec5f8237" (UID: "ce67a97c-6bfd-4684-be25-c82eec5f8237"). InnerVolumeSpecName "kube-api-access-r5hfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.363762 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce67a97c-6bfd-4684-be25-c82eec5f8237" (UID: "ce67a97c-6bfd-4684-be25-c82eec5f8237"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.370447 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-config-data" (OuterVolumeSpecName: "config-data") pod "ce67a97c-6bfd-4684-be25-c82eec5f8237" (UID: "ce67a97c-6bfd-4684-be25-c82eec5f8237"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.441106 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.441150 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5hfw\" (UniqueName: \"kubernetes.io/projected/ce67a97c-6bfd-4684-be25-c82eec5f8237-kube-api-access-r5hfw\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.441163 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.441173 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce67a97c-6bfd-4684-be25-c82eec5f8237-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.854284 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" event={"ID":"e18674b4-2633-4819-8e4c-81e122186c0b","Type":"ContainerDied","Data":"667332f85c4483a39343b8fcfd3d9f7d68689c5f7d4efe6916661f27cc5d281a"} Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.854297 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-989gj" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.854371 4755 scope.go:117] "RemoveContainer" containerID="54d3032fb98e75f87d1d15129afc65687b122c385be02c187f1f29a22bfea33e" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.867727 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wfkm6" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.867707 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wfkm6" event={"ID":"ce67a97c-6bfd-4684-be25-c82eec5f8237","Type":"ContainerDied","Data":"5cd9a3abe907b573a0817da845b5f15e9dc6f22eb2d7e14a0b9dcfc732e2c317"} Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.867786 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cd9a3abe907b573a0817da845b5f15e9dc6f22eb2d7e14a0b9dcfc732e2c317" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.872258 4755 generic.go:334] "Generic (PLEG): container finished" podID="ca576ccd-2a13-4b2c-ab8e-df22112b4711" containerID="8aac7e5406185c0e45a14415ceb0bf3f16b3de7488b4f5a15ee14ac46996736f" exitCode=0 Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.872336 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8f45x" event={"ID":"ca576ccd-2a13-4b2c-ab8e-df22112b4711","Type":"ContainerDied","Data":"8aac7e5406185c0e45a14415ceb0bf3f16b3de7488b4f5a15ee14ac46996736f"} Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.898843 4755 scope.go:117] "RemoveContainer" containerID="a3d8abe09ae4852061ef69995bb95d944415a4594c9fb66cd0b955b48752f742" Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.905174 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-989gj"] Oct 06 08:40:53 crc kubenswrapper[4755]: I1006 08:40:53.912982 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-989gj"] Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.128728 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.128950 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" containerName="nova-api-log" containerID="cri-o://1078790298f9dcbdb42a2385e6f84b2d1fbb0760019bac6563bfab9262024ed9" gracePeriod=30 Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.129028 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" containerName="nova-api-api" containerID="cri-o://c043b77332c8f73f114438b659342a72150aafed35cfe28b254e3ef0d7ea2c8d" gracePeriod=30 Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.215264 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.235484 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.235811 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1e954f36-4ba9-4de9-9289-206daeafc79f" containerName="nova-metadata-log" containerID="cri-o://90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567" gracePeriod=30 Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.236648 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1e954f36-4ba9-4de9-9289-206daeafc79f" containerName="nova-metadata-metadata" containerID="cri-o://88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89" gracePeriod=30 Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.786453 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.870876 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-combined-ca-bundle\") pod \"1e954f36-4ba9-4de9-9289-206daeafc79f\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.871013 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-config-data\") pod \"1e954f36-4ba9-4de9-9289-206daeafc79f\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.871112 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-nova-metadata-tls-certs\") pod \"1e954f36-4ba9-4de9-9289-206daeafc79f\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.871146 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf5lc\" (UniqueName: \"kubernetes.io/projected/1e954f36-4ba9-4de9-9289-206daeafc79f-kube-api-access-sf5lc\") pod \"1e954f36-4ba9-4de9-9289-206daeafc79f\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.871191 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e954f36-4ba9-4de9-9289-206daeafc79f-logs\") pod \"1e954f36-4ba9-4de9-9289-206daeafc79f\" (UID: \"1e954f36-4ba9-4de9-9289-206daeafc79f\") " Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.872072 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1e954f36-4ba9-4de9-9289-206daeafc79f-logs" (OuterVolumeSpecName: "logs") pod "1e954f36-4ba9-4de9-9289-206daeafc79f" (UID: "1e954f36-4ba9-4de9-9289-206daeafc79f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.876283 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e954f36-4ba9-4de9-9289-206daeafc79f-kube-api-access-sf5lc" (OuterVolumeSpecName: "kube-api-access-sf5lc") pod "1e954f36-4ba9-4de9-9289-206daeafc79f" (UID: "1e954f36-4ba9-4de9-9289-206daeafc79f"). InnerVolumeSpecName "kube-api-access-sf5lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.903618 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e954f36-4ba9-4de9-9289-206daeafc79f" (UID: "1e954f36-4ba9-4de9-9289-206daeafc79f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.907483 4755 generic.go:334] "Generic (PLEG): container finished" podID="1e954f36-4ba9-4de9-9289-206daeafc79f" containerID="88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89" exitCode=0 Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.907552 4755 generic.go:334] "Generic (PLEG): container finished" podID="1e954f36-4ba9-4de9-9289-206daeafc79f" containerID="90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567" exitCode=143 Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.907684 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e954f36-4ba9-4de9-9289-206daeafc79f","Type":"ContainerDied","Data":"88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89"} Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.907762 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e954f36-4ba9-4de9-9289-206daeafc79f","Type":"ContainerDied","Data":"90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567"} Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.907768 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.907782 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e954f36-4ba9-4de9-9289-206daeafc79f","Type":"ContainerDied","Data":"86ff1afce7bf135ddfcf22ddec74d331eec143bebbe44de95811c2b58c6fe21d"} Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.909102 4755 scope.go:117] "RemoveContainer" containerID="88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89" Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.916411 4755 generic.go:334] "Generic (PLEG): container finished" podID="48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" containerID="1078790298f9dcbdb42a2385e6f84b2d1fbb0760019bac6563bfab9262024ed9" exitCode=143 Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.916530 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb","Type":"ContainerDied","Data":"1078790298f9dcbdb42a2385e6f84b2d1fbb0760019bac6563bfab9262024ed9"} Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.933724 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a313ac16-c355-4334-b2dc-3da3b3229062" containerName="nova-scheduler-scheduler" containerID="cri-o://7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469" gracePeriod=30 Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.935261 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-config-data" (OuterVolumeSpecName: "config-data") pod "1e954f36-4ba9-4de9-9289-206daeafc79f" (UID: "1e954f36-4ba9-4de9-9289-206daeafc79f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.939312 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1e954f36-4ba9-4de9-9289-206daeafc79f" (UID: "1e954f36-4ba9-4de9-9289-206daeafc79f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.973110 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.973146 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.973162 4755 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e954f36-4ba9-4de9-9289-206daeafc79f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.973180 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf5lc\" (UniqueName: \"kubernetes.io/projected/1e954f36-4ba9-4de9-9289-206daeafc79f-kube-api-access-sf5lc\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:54 crc kubenswrapper[4755]: I1006 08:40:54.973195 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e954f36-4ba9-4de9-9289-206daeafc79f-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.007627 4755 scope.go:117] "RemoveContainer" 
containerID="90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.035002 4755 scope.go:117] "RemoveContainer" containerID="88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89" Oct 06 08:40:55 crc kubenswrapper[4755]: E1006 08:40:55.044083 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89\": container with ID starting with 88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89 not found: ID does not exist" containerID="88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.044124 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89"} err="failed to get container status \"88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89\": rpc error: code = NotFound desc = could not find container \"88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89\": container with ID starting with 88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89 not found: ID does not exist" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.044154 4755 scope.go:117] "RemoveContainer" containerID="90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567" Oct 06 08:40:55 crc kubenswrapper[4755]: E1006 08:40:55.044910 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567\": container with ID starting with 90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567 not found: ID does not exist" containerID="90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567" Oct 06 08:40:55 crc 
kubenswrapper[4755]: I1006 08:40:55.044931 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567"} err="failed to get container status \"90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567\": rpc error: code = NotFound desc = could not find container \"90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567\": container with ID starting with 90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567 not found: ID does not exist" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.044945 4755 scope.go:117] "RemoveContainer" containerID="88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.046231 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89"} err="failed to get container status \"88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89\": rpc error: code = NotFound desc = could not find container \"88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89\": container with ID starting with 88332f930e95f986f11b1383f2e895b0393f74641563bb0c2c21800e1ae11a89 not found: ID does not exist" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.046249 4755 scope.go:117] "RemoveContainer" containerID="90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.046429 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567"} err="failed to get container status \"90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567\": rpc error: code = NotFound desc = could not find container \"90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567\": container 
with ID starting with 90f319ab29e7325cf4e9a587bdb670bcbe586abee67623d82fe9e5979167d567 not found: ID does not exist" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.226180 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.269532 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.278648 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-combined-ca-bundle\") pod \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.278708 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b44tc\" (UniqueName: \"kubernetes.io/projected/ca576ccd-2a13-4b2c-ab8e-df22112b4711-kube-api-access-b44tc\") pod \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.278842 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-config-data\") pod \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.278888 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-scripts\") pod \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\" (UID: \"ca576ccd-2a13-4b2c-ab8e-df22112b4711\") " Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.293809 4755 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-scripts" (OuterVolumeSpecName: "scripts") pod "ca576ccd-2a13-4b2c-ab8e-df22112b4711" (UID: "ca576ccd-2a13-4b2c-ab8e-df22112b4711"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.297654 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.297831 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca576ccd-2a13-4b2c-ab8e-df22112b4711-kube-api-access-b44tc" (OuterVolumeSpecName: "kube-api-access-b44tc") pod "ca576ccd-2a13-4b2c-ab8e-df22112b4711" (UID: "ca576ccd-2a13-4b2c-ab8e-df22112b4711"). InnerVolumeSpecName "kube-api-access-b44tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.309721 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:55 crc kubenswrapper[4755]: E1006 08:40:55.310126 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce67a97c-6bfd-4684-be25-c82eec5f8237" containerName="nova-manage" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.310143 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce67a97c-6bfd-4684-be25-c82eec5f8237" containerName="nova-manage" Oct 06 08:40:55 crc kubenswrapper[4755]: E1006 08:40:55.310158 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18674b4-2633-4819-8e4c-81e122186c0b" containerName="dnsmasq-dns" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.310166 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18674b4-2633-4819-8e4c-81e122186c0b" containerName="dnsmasq-dns" Oct 06 08:40:55 crc kubenswrapper[4755]: E1006 08:40:55.310172 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e18674b4-2633-4819-8e4c-81e122186c0b" containerName="init" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.310178 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18674b4-2633-4819-8e4c-81e122186c0b" containerName="init" Oct 06 08:40:55 crc kubenswrapper[4755]: E1006 08:40:55.310194 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e954f36-4ba9-4de9-9289-206daeafc79f" containerName="nova-metadata-metadata" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.310200 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e954f36-4ba9-4de9-9289-206daeafc79f" containerName="nova-metadata-metadata" Oct 06 08:40:55 crc kubenswrapper[4755]: E1006 08:40:55.310209 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e954f36-4ba9-4de9-9289-206daeafc79f" containerName="nova-metadata-log" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.310216 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e954f36-4ba9-4de9-9289-206daeafc79f" containerName="nova-metadata-log" Oct 06 08:40:55 crc kubenswrapper[4755]: E1006 08:40:55.310232 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca576ccd-2a13-4b2c-ab8e-df22112b4711" containerName="nova-cell1-conductor-db-sync" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.310237 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca576ccd-2a13-4b2c-ab8e-df22112b4711" containerName="nova-cell1-conductor-db-sync" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.310400 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce67a97c-6bfd-4684-be25-c82eec5f8237" containerName="nova-manage" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.310411 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca576ccd-2a13-4b2c-ab8e-df22112b4711" containerName="nova-cell1-conductor-db-sync" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.310421 4755 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1e954f36-4ba9-4de9-9289-206daeafc79f" containerName="nova-metadata-metadata" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.310432 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e954f36-4ba9-4de9-9289-206daeafc79f" containerName="nova-metadata-log" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.310438 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e18674b4-2633-4819-8e4c-81e122186c0b" containerName="dnsmasq-dns" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.311378 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.314565 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.317092 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.317313 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.338474 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca576ccd-2a13-4b2c-ab8e-df22112b4711" (UID: "ca576ccd-2a13-4b2c-ab8e-df22112b4711"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.348822 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-config-data" (OuterVolumeSpecName: "config-data") pod "ca576ccd-2a13-4b2c-ab8e-df22112b4711" (UID: "ca576ccd-2a13-4b2c-ab8e-df22112b4711"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.383278 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.383354 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.383451 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e657415-bec0-4087-936c-aee106e6e624-logs\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.383535 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99442\" (UniqueName: \"kubernetes.io/projected/7e657415-bec0-4087-936c-aee106e6e624-kube-api-access-99442\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.383651 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-config-data\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " 
pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.383746 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.383769 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.383781 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca576ccd-2a13-4b2c-ab8e-df22112b4711-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.383796 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b44tc\" (UniqueName: \"kubernetes.io/projected/ca576ccd-2a13-4b2c-ab8e-df22112b4711-kube-api-access-b44tc\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.485352 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-config-data\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.485415 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.485443 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.485502 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e657415-bec0-4087-936c-aee106e6e624-logs\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.485558 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99442\" (UniqueName: \"kubernetes.io/projected/7e657415-bec0-4087-936c-aee106e6e624-kube-api-access-99442\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.487890 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e657415-bec0-4087-936c-aee106e6e624-logs\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.489818 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-config-data\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.490587 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc 
kubenswrapper[4755]: I1006 08:40:55.501221 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.503109 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99442\" (UniqueName: \"kubernetes.io/projected/7e657415-bec0-4087-936c-aee106e6e624-kube-api-access-99442\") pod \"nova-metadata-0\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.639006 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.894242 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e954f36-4ba9-4de9-9289-206daeafc79f" path="/var/lib/kubelet/pods/1e954f36-4ba9-4de9-9289-206daeafc79f/volumes" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.895239 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e18674b4-2633-4819-8e4c-81e122186c0b" path="/var/lib/kubelet/pods/e18674b4-2633-4819-8e4c-81e122186c0b/volumes" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.946934 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8f45x" event={"ID":"ca576ccd-2a13-4b2c-ab8e-df22112b4711","Type":"ContainerDied","Data":"0a18c82685106438e754ee8dd2ff5414a92f195446a9b5d7893687cd73a29c63"} Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.946972 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a18c82685106438e754ee8dd2ff5414a92f195446a9b5d7893687cd73a29c63" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.947015 4755 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8f45x" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.979783 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.981275 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 08:40:55 crc kubenswrapper[4755]: I1006 08:40:55.985340 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.001758 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.103785 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bddbbc-1f14-40bd-921e-f5b85fc1b68e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"36bddbbc-1f14-40bd-921e-f5b85fc1b68e\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.103916 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36bddbbc-1f14-40bd-921e-f5b85fc1b68e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"36bddbbc-1f14-40bd-921e-f5b85fc1b68e\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.103984 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwn6p\" (UniqueName: \"kubernetes.io/projected/36bddbbc-1f14-40bd-921e-f5b85fc1b68e-kube-api-access-gwn6p\") pod \"nova-cell1-conductor-0\" (UID: \"36bddbbc-1f14-40bd-921e-f5b85fc1b68e\") " pod="openstack/nova-cell1-conductor-0" Oct 
06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.113043 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:56 crc kubenswrapper[4755]: W1006 08:40:56.116412 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e657415_bec0_4087_936c_aee106e6e624.slice/crio-641c833398c50cdae9a9374e766c5ac9446b5d1bb2c5e895baa5659f722072bd WatchSource:0}: Error finding container 641c833398c50cdae9a9374e766c5ac9446b5d1bb2c5e895baa5659f722072bd: Status 404 returned error can't find the container with id 641c833398c50cdae9a9374e766c5ac9446b5d1bb2c5e895baa5659f722072bd Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.205466 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36bddbbc-1f14-40bd-921e-f5b85fc1b68e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"36bddbbc-1f14-40bd-921e-f5b85fc1b68e\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.205574 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwn6p\" (UniqueName: \"kubernetes.io/projected/36bddbbc-1f14-40bd-921e-f5b85fc1b68e-kube-api-access-gwn6p\") pod \"nova-cell1-conductor-0\" (UID: \"36bddbbc-1f14-40bd-921e-f5b85fc1b68e\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.205978 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bddbbc-1f14-40bd-921e-f5b85fc1b68e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"36bddbbc-1f14-40bd-921e-f5b85fc1b68e\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.211918 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/36bddbbc-1f14-40bd-921e-f5b85fc1b68e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"36bddbbc-1f14-40bd-921e-f5b85fc1b68e\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.212475 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bddbbc-1f14-40bd-921e-f5b85fc1b68e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"36bddbbc-1f14-40bd-921e-f5b85fc1b68e\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.221040 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwn6p\" (UniqueName: \"kubernetes.io/projected/36bddbbc-1f14-40bd-921e-f5b85fc1b68e-kube-api-access-gwn6p\") pod \"nova-cell1-conductor-0\" (UID: \"36bddbbc-1f14-40bd-921e-f5b85fc1b68e\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.302537 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.727780 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 08:40:56 crc kubenswrapper[4755]: W1006 08:40:56.733090 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36bddbbc_1f14_40bd_921e_f5b85fc1b68e.slice/crio-77adc61039cfcd8e006d543074899d0288784ca2744f940e782989ebe5c05729 WatchSource:0}: Error finding container 77adc61039cfcd8e006d543074899d0288784ca2744f940e782989ebe5c05729: Status 404 returned error can't find the container with id 77adc61039cfcd8e006d543074899d0288784ca2744f940e782989ebe5c05729 Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.957802 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"36bddbbc-1f14-40bd-921e-f5b85fc1b68e","Type":"ContainerStarted","Data":"360d09b0d8a9c80116590f6a0da27b5ccea5c5a2d69f0fb07d837c4aa9c11a25"} Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.958322 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.958366 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"36bddbbc-1f14-40bd-921e-f5b85fc1b68e","Type":"ContainerStarted","Data":"77adc61039cfcd8e006d543074899d0288784ca2744f940e782989ebe5c05729"} Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.960214 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7e657415-bec0-4087-936c-aee106e6e624","Type":"ContainerStarted","Data":"cb9d2ce7c5f4e55395eb161409704dc26c35093116bb43f11634f9025660af46"} Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.960242 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"7e657415-bec0-4087-936c-aee106e6e624","Type":"ContainerStarted","Data":"35c3a15be3296878f4ac01a36c97a4ee83bffb62f920f88b9ecfe54edb3719a8"} Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.960253 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7e657415-bec0-4087-936c-aee106e6e624","Type":"ContainerStarted","Data":"641c833398c50cdae9a9374e766c5ac9446b5d1bb2c5e895baa5659f722072bd"} Oct 06 08:40:56 crc kubenswrapper[4755]: I1006 08:40:56.991171 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.991145465 podStartE2EDuration="1.991145465s" podCreationTimestamp="2025-10-06 08:40:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:56.983927731 +0000 UTC m=+1113.813242945" watchObservedRunningTime="2025-10-06 08:40:56.991145465 +0000 UTC m=+1113.820460679" Oct 06 08:40:57 crc kubenswrapper[4755]: I1006 08:40:57.007172 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.007154534 podStartE2EDuration="2.007154534s" podCreationTimestamp="2025-10-06 08:40:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:57.003336227 +0000 UTC m=+1113.832651441" watchObservedRunningTime="2025-10-06 08:40:57.007154534 +0000 UTC m=+1113.836469748" Oct 06 08:40:57 crc kubenswrapper[4755]: E1006 08:40:57.070559 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] 
Oct 06 08:40:57 crc kubenswrapper[4755]: E1006 08:40:57.072187 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 08:40:57 crc kubenswrapper[4755]: E1006 08:40:57.073853 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 08:40:57 crc kubenswrapper[4755]: E1006 08:40:57.073896 4755 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a313ac16-c355-4334-b2dc-3da3b3229062" containerName="nova-scheduler-scheduler" Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.480390 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.549124 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a313ac16-c355-4334-b2dc-3da3b3229062-combined-ca-bundle\") pod \"a313ac16-c355-4334-b2dc-3da3b3229062\" (UID: \"a313ac16-c355-4334-b2dc-3da3b3229062\") " Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.549214 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snpcj\" (UniqueName: \"kubernetes.io/projected/a313ac16-c355-4334-b2dc-3da3b3229062-kube-api-access-snpcj\") pod \"a313ac16-c355-4334-b2dc-3da3b3229062\" (UID: \"a313ac16-c355-4334-b2dc-3da3b3229062\") " Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.549270 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a313ac16-c355-4334-b2dc-3da3b3229062-config-data\") pod \"a313ac16-c355-4334-b2dc-3da3b3229062\" (UID: \"a313ac16-c355-4334-b2dc-3da3b3229062\") " Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.556727 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a313ac16-c355-4334-b2dc-3da3b3229062-kube-api-access-snpcj" (OuterVolumeSpecName: "kube-api-access-snpcj") pod "a313ac16-c355-4334-b2dc-3da3b3229062" (UID: "a313ac16-c355-4334-b2dc-3da3b3229062"). InnerVolumeSpecName "kube-api-access-snpcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.574969 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a313ac16-c355-4334-b2dc-3da3b3229062-config-data" (OuterVolumeSpecName: "config-data") pod "a313ac16-c355-4334-b2dc-3da3b3229062" (UID: "a313ac16-c355-4334-b2dc-3da3b3229062"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.584248 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a313ac16-c355-4334-b2dc-3da3b3229062-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a313ac16-c355-4334-b2dc-3da3b3229062" (UID: "a313ac16-c355-4334-b2dc-3da3b3229062"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.651700 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a313ac16-c355-4334-b2dc-3da3b3229062-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.651737 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a313ac16-c355-4334-b2dc-3da3b3229062-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.651749 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snpcj\" (UniqueName: \"kubernetes.io/projected/a313ac16-c355-4334-b2dc-3da3b3229062-kube-api-access-snpcj\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.921043 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.956245 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7bjr\" (UniqueName: \"kubernetes.io/projected/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-kube-api-access-h7bjr\") pod \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.956317 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-logs\") pod \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.956470 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-combined-ca-bundle\") pod \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.956509 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-config-data\") pod \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\" (UID: \"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb\") " Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.958452 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-logs" (OuterVolumeSpecName: "logs") pod "48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" (UID: "48b49ce4-e2fc-4393-9de1-b556f7fbd7eb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.963517 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-kube-api-access-h7bjr" (OuterVolumeSpecName: "kube-api-access-h7bjr") pod "48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" (UID: "48b49ce4-e2fc-4393-9de1-b556f7fbd7eb"). InnerVolumeSpecName "kube-api-access-h7bjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.990840 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-config-data" (OuterVolumeSpecName: "config-data") pod "48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" (UID: "48b49ce4-e2fc-4393-9de1-b556f7fbd7eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.996410 4755 generic.go:334] "Generic (PLEG): container finished" podID="a313ac16-c355-4334-b2dc-3da3b3229062" containerID="7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469" exitCode=0 Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.996503 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a313ac16-c355-4334-b2dc-3da3b3229062","Type":"ContainerDied","Data":"7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469"} Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.996537 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a313ac16-c355-4334-b2dc-3da3b3229062","Type":"ContainerDied","Data":"c58902d599c551142e44afadbf1b34cefb7ed810f401912cdd6ccab6f5674a7b"} Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.996561 4755 scope.go:117] "RemoveContainer" containerID="7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469" Oct 06 08:40:58 
crc kubenswrapper[4755]: I1006 08:40:58.996718 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:40:58 crc kubenswrapper[4755]: I1006 08:40:58.999877 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" (UID: "48b49ce4-e2fc-4393-9de1-b556f7fbd7eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.000835 4755 generic.go:334] "Generic (PLEG): container finished" podID="48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" containerID="c043b77332c8f73f114438b659342a72150aafed35cfe28b254e3ef0d7ea2c8d" exitCode=0 Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.000878 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb","Type":"ContainerDied","Data":"c043b77332c8f73f114438b659342a72150aafed35cfe28b254e3ef0d7ea2c8d"} Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.000908 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48b49ce4-e2fc-4393-9de1-b556f7fbd7eb","Type":"ContainerDied","Data":"23238e573a1436617392b0c2adccdc5ffbc0ae539daff6fa3dbb461d8dc1c376"} Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.000975 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.042384 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.042542 4755 scope.go:117] "RemoveContainer" containerID="7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469" Oct 06 08:40:59 crc kubenswrapper[4755]: E1006 08:40:59.047383 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469\": container with ID starting with 7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469 not found: ID does not exist" containerID="7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469" Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.047429 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469"} err="failed to get container status \"7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469\": rpc error: code = NotFound desc = could not find container \"7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469\": container with ID starting with 7fb7253a618bd5f1fe49a9b1d861f22841272bc3baedf0816d06bbf63b3b5469 not found: ID does not exist" Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.047454 4755 scope.go:117] "RemoveContainer" containerID="c043b77332c8f73f114438b659342a72150aafed35cfe28b254e3ef0d7ea2c8d" Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.058771 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7bjr\" (UniqueName: \"kubernetes.io/projected/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-kube-api-access-h7bjr\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.058805 4755 reconciler_common.go:293] "Volume 
detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.058818 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.058826 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.081670 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.105231 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.121817 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.121939 4755 scope.go:117] "RemoveContainer" containerID="1078790298f9dcbdb42a2385e6f84b2d1fbb0760019bac6563bfab9262024ed9" Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.131153 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 08:40:59 crc kubenswrapper[4755]: E1006 08:40:59.131643 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a313ac16-c355-4334-b2dc-3da3b3229062" containerName="nova-scheduler-scheduler" Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.131667 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a313ac16-c355-4334-b2dc-3da3b3229062" containerName="nova-scheduler-scheduler" Oct 06 08:40:59 crc kubenswrapper[4755]: E1006 08:40:59.131682 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" containerName="nova-api-api"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.131690 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" containerName="nova-api-api"
Oct 06 08:40:59 crc kubenswrapper[4755]: E1006 08:40:59.131728 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" containerName="nova-api-log"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.131736 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" containerName="nova-api-log"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.131916 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" containerName="nova-api-log"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.131931 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a313ac16-c355-4334-b2dc-3da3b3229062" containerName="nova-scheduler-scheduler"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.131943 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" containerName="nova-api-api"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.133079 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.136428 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.140880 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.142530 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.143392 4755 scope.go:117] "RemoveContainer" containerID="c043b77332c8f73f114438b659342a72150aafed35cfe28b254e3ef0d7ea2c8d"
Oct 06 08:40:59 crc kubenswrapper[4755]: E1006 08:40:59.143899 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c043b77332c8f73f114438b659342a72150aafed35cfe28b254e3ef0d7ea2c8d\": container with ID starting with c043b77332c8f73f114438b659342a72150aafed35cfe28b254e3ef0d7ea2c8d not found: ID does not exist" containerID="c043b77332c8f73f114438b659342a72150aafed35cfe28b254e3ef0d7ea2c8d"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.143932 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c043b77332c8f73f114438b659342a72150aafed35cfe28b254e3ef0d7ea2c8d"} err="failed to get container status \"c043b77332c8f73f114438b659342a72150aafed35cfe28b254e3ef0d7ea2c8d\": rpc error: code = NotFound desc = could not find container \"c043b77332c8f73f114438b659342a72150aafed35cfe28b254e3ef0d7ea2c8d\": container with ID starting with c043b77332c8f73f114438b659342a72150aafed35cfe28b254e3ef0d7ea2c8d not found: ID does not exist"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.143961 4755 scope.go:117] "RemoveContainer" containerID="1078790298f9dcbdb42a2385e6f84b2d1fbb0760019bac6563bfab9262024ed9"
Oct 06 08:40:59 crc kubenswrapper[4755]: E1006 08:40:59.144433 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1078790298f9dcbdb42a2385e6f84b2d1fbb0760019bac6563bfab9262024ed9\": container with ID starting with 1078790298f9dcbdb42a2385e6f84b2d1fbb0760019bac6563bfab9262024ed9 not found: ID does not exist" containerID="1078790298f9dcbdb42a2385e6f84b2d1fbb0760019bac6563bfab9262024ed9"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.144459 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1078790298f9dcbdb42a2385e6f84b2d1fbb0760019bac6563bfab9262024ed9"} err="failed to get container status \"1078790298f9dcbdb42a2385e6f84b2d1fbb0760019bac6563bfab9262024ed9\": rpc error: code = NotFound desc = could not find container \"1078790298f9dcbdb42a2385e6f84b2d1fbb0760019bac6563bfab9262024ed9\": container with ID starting with 1078790298f9dcbdb42a2385e6f84b2d1fbb0760019bac6563bfab9262024ed9 not found: ID does not exist"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.144609 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.153265 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.160108 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6775ab5f-5b0b-4853-a743-37e2070c5419-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.160175 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6775ab5f-5b0b-4853-a743-37e2070c5419-config-data\") pod \"nova-api-0\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.160278 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6775ab5f-5b0b-4853-a743-37e2070c5419-logs\") pod \"nova-api-0\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.160311 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flqh6\" (UniqueName: \"kubernetes.io/projected/6775ab5f-5b0b-4853-a743-37e2070c5419-kube-api-access-flqh6\") pod \"nova-api-0\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.160426 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g2t8\" (UniqueName: \"kubernetes.io/projected/8b4fe16f-996e-4563-9d52-0d324aad3eb5-kube-api-access-4g2t8\") pod \"nova-scheduler-0\" (UID: \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\") " pod="openstack/nova-scheduler-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.160459 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4fe16f-996e-4563-9d52-0d324aad3eb5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\") " pod="openstack/nova-scheduler-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.160492 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b4fe16f-996e-4563-9d52-0d324aad3eb5-config-data\") pod \"nova-scheduler-0\" (UID: \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\") " pod="openstack/nova-scheduler-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.165781 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.261458 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g2t8\" (UniqueName: \"kubernetes.io/projected/8b4fe16f-996e-4563-9d52-0d324aad3eb5-kube-api-access-4g2t8\") pod \"nova-scheduler-0\" (UID: \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\") " pod="openstack/nova-scheduler-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.261901 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4fe16f-996e-4563-9d52-0d324aad3eb5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\") " pod="openstack/nova-scheduler-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.262062 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b4fe16f-996e-4563-9d52-0d324aad3eb5-config-data\") pod \"nova-scheduler-0\" (UID: \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\") " pod="openstack/nova-scheduler-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.262260 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6775ab5f-5b0b-4853-a743-37e2070c5419-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.262370 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6775ab5f-5b0b-4853-a743-37e2070c5419-config-data\") pod \"nova-api-0\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.262511 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6775ab5f-5b0b-4853-a743-37e2070c5419-logs\") pod \"nova-api-0\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.262645 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flqh6\" (UniqueName: \"kubernetes.io/projected/6775ab5f-5b0b-4853-a743-37e2070c5419-kube-api-access-flqh6\") pod \"nova-api-0\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.263121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6775ab5f-5b0b-4853-a743-37e2070c5419-logs\") pod \"nova-api-0\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.266985 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6775ab5f-5b0b-4853-a743-37e2070c5419-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.267249 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6775ab5f-5b0b-4853-a743-37e2070c5419-config-data\") pod \"nova-api-0\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.267534 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b4fe16f-996e-4563-9d52-0d324aad3eb5-config-data\") pod \"nova-scheduler-0\" (UID: \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\") " pod="openstack/nova-scheduler-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.268286 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4fe16f-996e-4563-9d52-0d324aad3eb5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\") " pod="openstack/nova-scheduler-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.281424 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g2t8\" (UniqueName: \"kubernetes.io/projected/8b4fe16f-996e-4563-9d52-0d324aad3eb5-kube-api-access-4g2t8\") pod \"nova-scheduler-0\" (UID: \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\") " pod="openstack/nova-scheduler-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.283741 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flqh6\" (UniqueName: \"kubernetes.io/projected/6775ab5f-5b0b-4853-a743-37e2070c5419-kube-api-access-flqh6\") pod \"nova-api-0\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.453132 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.463838 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.888347 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b49ce4-e2fc-4393-9de1-b556f7fbd7eb" path="/var/lib/kubelet/pods/48b49ce4-e2fc-4393-9de1-b556f7fbd7eb/volumes"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.889394 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a313ac16-c355-4334-b2dc-3da3b3229062" path="/var/lib/kubelet/pods/a313ac16-c355-4334-b2dc-3da3b3229062/volumes"
Oct 06 08:40:59 crc kubenswrapper[4755]: I1006 08:40:59.943409 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 06 08:41:00 crc kubenswrapper[4755]: I1006 08:41:00.006772 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 06 08:41:00 crc kubenswrapper[4755]: I1006 08:41:00.012334 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b4fe16f-996e-4563-9d52-0d324aad3eb5","Type":"ContainerStarted","Data":"202423b1103284de6c91a61ccce3644dc9c95f0d39edcce8c3fc8c3ab5c44064"}
Oct 06 08:41:00 crc kubenswrapper[4755]: I1006 08:41:00.640431 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 06 08:41:00 crc kubenswrapper[4755]: I1006 08:41:00.641089 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 06 08:41:01 crc kubenswrapper[4755]: I1006 08:41:01.028407 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6775ab5f-5b0b-4853-a743-37e2070c5419","Type":"ContainerStarted","Data":"951b5cf926b19c23d46ed6c2d5f2200f5d4df155fac885177ed6b90d62314563"}
Oct 06 08:41:01 crc kubenswrapper[4755]: I1006 08:41:01.029238 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6775ab5f-5b0b-4853-a743-37e2070c5419","Type":"ContainerStarted","Data":"5e05655ffdc430e06918543aa3e4bcf7c45690487f5da46ee12ba50e466e3c37"}
Oct 06 08:41:01 crc kubenswrapper[4755]: I1006 08:41:01.029906 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6775ab5f-5b0b-4853-a743-37e2070c5419","Type":"ContainerStarted","Data":"5fe15638b1a60cfb0d01c63a27fb33c642daa1a375eb96b43da44c1ecebc2943"}
Oct 06 08:41:01 crc kubenswrapper[4755]: I1006 08:41:01.031679 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b4fe16f-996e-4563-9d52-0d324aad3eb5","Type":"ContainerStarted","Data":"c659f6a071b4d2c7eac56c39d72dae7e4437f76987b984a342bd50e39835573b"}
Oct 06 08:41:01 crc kubenswrapper[4755]: I1006 08:41:01.053773 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.05375135 podStartE2EDuration="2.05375135s" podCreationTimestamp="2025-10-06 08:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:01.047193922 +0000 UTC m=+1117.876509146" watchObservedRunningTime="2025-10-06 08:41:01.05375135 +0000 UTC m=+1117.883066564"
Oct 06 08:41:01 crc kubenswrapper[4755]: I1006 08:41:01.074371 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.074351836 podStartE2EDuration="2.074351836s" podCreationTimestamp="2025-10-06 08:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:01.063887998 +0000 UTC m=+1117.893203212" watchObservedRunningTime="2025-10-06 08:41:01.074351836 +0000 UTC m=+1117.903667050"
Oct 06 08:41:01 crc kubenswrapper[4755]: I1006 08:41:01.328464 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Oct 06 08:41:04 crc kubenswrapper[4755]: I1006 08:41:04.464240 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 06 08:41:05 crc kubenswrapper[4755]: I1006 08:41:05.640764 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 06 08:41:05 crc kubenswrapper[4755]: I1006 08:41:05.640829 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 06 08:41:06 crc kubenswrapper[4755]: I1006 08:41:06.658755 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7e657415-bec0-4087-936c-aee106e6e624" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.179:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 06 08:41:06 crc kubenswrapper[4755]: I1006 08:41:06.658772 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7e657415-bec0-4087-936c-aee106e6e624" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.179:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 06 08:41:09 crc kubenswrapper[4755]: I1006 08:41:09.454446 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 06 08:41:09 crc kubenswrapper[4755]: I1006 08:41:09.455226 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 06 08:41:09 crc kubenswrapper[4755]: I1006 08:41:09.464768 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 06 08:41:09 crc kubenswrapper[4755]: I1006 08:41:09.498910 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 06 08:41:10 crc kubenswrapper[4755]: I1006 08:41:10.135819 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 06 08:41:10 crc kubenswrapper[4755]: I1006 08:41:10.539923 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6775ab5f-5b0b-4853-a743-37e2070c5419" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 06 08:41:10 crc kubenswrapper[4755]: I1006 08:41:10.539971 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6775ab5f-5b0b-4853-a743-37e2070c5419" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 06 08:41:13 crc kubenswrapper[4755]: I1006 08:41:13.769536 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 06 08:41:15 crc kubenswrapper[4755]: I1006 08:41:15.645427 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 06 08:41:15 crc kubenswrapper[4755]: I1006 08:41:15.648407 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 06 08:41:15 crc kubenswrapper[4755]: I1006 08:41:15.654146 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 06 08:41:16 crc kubenswrapper[4755]: I1006 08:41:16.169050 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.120810 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.174725 4755 generic.go:334] "Generic (PLEG): container finished" podID="183dc39f-4089-4993-b806-0c8a6a76c58a" containerID="7aa602c9a9c5e5e219815aca8a8c35ed14ef6ac6507b2621b108c1076f81142e" exitCode=137
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.175591 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.175743 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"183dc39f-4089-4993-b806-0c8a6a76c58a","Type":"ContainerDied","Data":"7aa602c9a9c5e5e219815aca8a8c35ed14ef6ac6507b2621b108c1076f81142e"}
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.175798 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"183dc39f-4089-4993-b806-0c8a6a76c58a","Type":"ContainerDied","Data":"c7c01112051c18413b4c3123e0d770bdec3fed1fce5bf63941ec5ce3f69e9b3d"}
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.175825 4755 scope.go:117] "RemoveContainer" containerID="7aa602c9a9c5e5e219815aca8a8c35ed14ef6ac6507b2621b108c1076f81142e"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.193027 4755 scope.go:117] "RemoveContainer" containerID="7aa602c9a9c5e5e219815aca8a8c35ed14ef6ac6507b2621b108c1076f81142e"
Oct 06 08:41:17 crc kubenswrapper[4755]: E1006 08:41:17.193544 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aa602c9a9c5e5e219815aca8a8c35ed14ef6ac6507b2621b108c1076f81142e\": container with ID starting with 7aa602c9a9c5e5e219815aca8a8c35ed14ef6ac6507b2621b108c1076f81142e not found: ID does not exist" containerID="7aa602c9a9c5e5e219815aca8a8c35ed14ef6ac6507b2621b108c1076f81142e"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.193623 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa602c9a9c5e5e219815aca8a8c35ed14ef6ac6507b2621b108c1076f81142e"} err="failed to get container status \"7aa602c9a9c5e5e219815aca8a8c35ed14ef6ac6507b2621b108c1076f81142e\": rpc error: code = NotFound desc = could not find container \"7aa602c9a9c5e5e219815aca8a8c35ed14ef6ac6507b2621b108c1076f81142e\": container with ID starting with 7aa602c9a9c5e5e219815aca8a8c35ed14ef6ac6507b2621b108c1076f81142e not found: ID does not exist"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.204307 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183dc39f-4089-4993-b806-0c8a6a76c58a-combined-ca-bundle\") pod \"183dc39f-4089-4993-b806-0c8a6a76c58a\" (UID: \"183dc39f-4089-4993-b806-0c8a6a76c58a\") "
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.204407 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44jrx\" (UniqueName: \"kubernetes.io/projected/183dc39f-4089-4993-b806-0c8a6a76c58a-kube-api-access-44jrx\") pod \"183dc39f-4089-4993-b806-0c8a6a76c58a\" (UID: \"183dc39f-4089-4993-b806-0c8a6a76c58a\") "
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.204536 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183dc39f-4089-4993-b806-0c8a6a76c58a-config-data\") pod \"183dc39f-4089-4993-b806-0c8a6a76c58a\" (UID: \"183dc39f-4089-4993-b806-0c8a6a76c58a\") "
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.209634 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183dc39f-4089-4993-b806-0c8a6a76c58a-kube-api-access-44jrx" (OuterVolumeSpecName: "kube-api-access-44jrx") pod "183dc39f-4089-4993-b806-0c8a6a76c58a" (UID: "183dc39f-4089-4993-b806-0c8a6a76c58a"). InnerVolumeSpecName "kube-api-access-44jrx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.230385 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183dc39f-4089-4993-b806-0c8a6a76c58a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "183dc39f-4089-4993-b806-0c8a6a76c58a" (UID: "183dc39f-4089-4993-b806-0c8a6a76c58a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.234350 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183dc39f-4089-4993-b806-0c8a6a76c58a-config-data" (OuterVolumeSpecName: "config-data") pod "183dc39f-4089-4993-b806-0c8a6a76c58a" (UID: "183dc39f-4089-4993-b806-0c8a6a76c58a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.306224 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183dc39f-4089-4993-b806-0c8a6a76c58a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.306251 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44jrx\" (UniqueName: \"kubernetes.io/projected/183dc39f-4089-4993-b806-0c8a6a76c58a-kube-api-access-44jrx\") on node \"crc\" DevicePath \"\""
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.306260 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183dc39f-4089-4993-b806-0c8a6a76c58a-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.508282 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.515232 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.531482 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 08:41:17 crc kubenswrapper[4755]: E1006 08:41:17.531967 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183dc39f-4089-4993-b806-0c8a6a76c58a" containerName="nova-cell1-novncproxy-novncproxy"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.531989 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="183dc39f-4089-4993-b806-0c8a6a76c58a" containerName="nova-cell1-novncproxy-novncproxy"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.532192 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="183dc39f-4089-4993-b806-0c8a6a76c58a" containerName="nova-cell1-novncproxy-novncproxy"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.532925 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.535624 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.535798 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.535902 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.550038 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.611320 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9450bb6-dd0a-4168-b34d-239829435571-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.611715 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9450bb6-dd0a-4168-b34d-239829435571-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.611871 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9450bb6-dd0a-4168-b34d-239829435571-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.612411 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9450bb6-dd0a-4168-b34d-239829435571-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.612792 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqz5m\" (UniqueName: \"kubernetes.io/projected/b9450bb6-dd0a-4168-b34d-239829435571-kube-api-access-vqz5m\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.714043 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9450bb6-dd0a-4168-b34d-239829435571-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.714093 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9450bb6-dd0a-4168-b34d-239829435571-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.714135 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqz5m\" (UniqueName: \"kubernetes.io/projected/b9450bb6-dd0a-4168-b34d-239829435571-kube-api-access-vqz5m\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.714165 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9450bb6-dd0a-4168-b34d-239829435571-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.714208 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9450bb6-dd0a-4168-b34d-239829435571-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.719327 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9450bb6-dd0a-4168-b34d-239829435571-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.719497 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9450bb6-dd0a-4168-b34d-239829435571-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.720822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9450bb6-dd0a-4168-b34d-239829435571-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.720868 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9450bb6-dd0a-4168-b34d-239829435571-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.738198 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqz5m\" (UniqueName: \"kubernetes.io/projected/b9450bb6-dd0a-4168-b34d-239829435571-kube-api-access-vqz5m\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9450bb6-dd0a-4168-b34d-239829435571\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.857302 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:17 crc kubenswrapper[4755]: I1006 08:41:17.902366 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183dc39f-4089-4993-b806-0c8a6a76c58a" path="/var/lib/kubelet/pods/183dc39f-4089-4993-b806-0c8a6a76c58a/volumes"
Oct 06 08:41:18 crc kubenswrapper[4755]: I1006 08:41:18.321820 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 08:41:19 crc kubenswrapper[4755]: I1006 08:41:19.191649 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9450bb6-dd0a-4168-b34d-239829435571","Type":"ContainerStarted","Data":"33b005ca2fdac2fbf62fd9917b268328d0e74f81559ad08f9ae342de180949be"}
Oct 06 08:41:19 crc kubenswrapper[4755]: I1006 08:41:19.191970 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9450bb6-dd0a-4168-b34d-239829435571","Type":"ContainerStarted","Data":"7ce126b1d802b3da7c386b9ca4bb6917d07bb5f87c9277743d74a7470ff00c13"}
Oct 06 08:41:19 crc kubenswrapper[4755]: I1006 08:41:19.217923 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.217905163 podStartE2EDuration="2.217905163s" podCreationTimestamp="2025-10-06 08:41:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:19.206501382 +0000 UTC m=+1136.035816596" watchObservedRunningTime="2025-10-06 08:41:19.217905163 +0000 UTC m=+1136.047220377"
Oct 06 08:41:19 crc kubenswrapper[4755]: I1006 08:41:19.459530 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 06 08:41:19 crc kubenswrapper[4755]: I1006 08:41:19.460223 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 06 08:41:19 crc kubenswrapper[4755]: I1006 08:41:19.465225 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 06 08:41:19 crc kubenswrapper[4755]: I1006 08:41:19.476334 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.202620 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.206679 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.386393 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-lpr8r"]
Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.388401 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-lpr8r"
Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.399496 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-lpr8r"]
Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.476428 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qfq5\" (UniqueName: \"kubernetes.io/projected/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-kube-api-access-5qfq5\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r"
Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.476890 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r"
Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.477254 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-dns-svc\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r"
Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.477302 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r"
Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.477402 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-config\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r"
Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.578817 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r"
Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.578936 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-dns-svc\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r"
Oct 06 08:41:20 crc kubenswrapper[4755]: I1006
08:41:20.578962 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.578991 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-config\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.579042 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qfq5\" (UniqueName: \"kubernetes.io/projected/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-kube-api-access-5qfq5\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.580007 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-config\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.580034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.580013 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-dns-svc\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.580155 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.598635 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qfq5\" (UniqueName: \"kubernetes.io/projected/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-kube-api-access-5qfq5\") pod \"dnsmasq-dns-5b856c5697-lpr8r\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" Oct 06 08:41:20 crc kubenswrapper[4755]: I1006 08:41:20.719228 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" Oct 06 08:41:21 crc kubenswrapper[4755]: I1006 08:41:21.176200 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-lpr8r"] Oct 06 08:41:21 crc kubenswrapper[4755]: I1006 08:41:21.211194 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" event={"ID":"6bcd099d-2fe7-4237-9338-e7a9aefc1dec","Type":"ContainerStarted","Data":"568c0bc1fb5d50c7b1973b3166bb09f9e4a704df085256b5b7b2f967b00cb582"} Oct 06 08:41:22 crc kubenswrapper[4755]: I1006 08:41:22.223426 4755 generic.go:334] "Generic (PLEG): container finished" podID="6bcd099d-2fe7-4237-9338-e7a9aefc1dec" containerID="6fa35f50ed7adf2e029dd280ecdeecf85a2d3e0921e0a9bb55fa626c820c1ac9" exitCode=0 Oct 06 08:41:22 crc kubenswrapper[4755]: I1006 08:41:22.225170 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" event={"ID":"6bcd099d-2fe7-4237-9338-e7a9aefc1dec","Type":"ContainerDied","Data":"6fa35f50ed7adf2e029dd280ecdeecf85a2d3e0921e0a9bb55fa626c820c1ac9"} Oct 06 08:41:22 crc kubenswrapper[4755]: I1006 08:41:22.523388 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:22 crc kubenswrapper[4755]: I1006 08:41:22.524020 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="ceilometer-central-agent" containerID="cri-o://4031ca26b25c3fa4fe7efa088d8541608f19ba99fdb5e3a93d2e03acde1582b6" gracePeriod=30 Oct 06 08:41:22 crc kubenswrapper[4755]: I1006 08:41:22.524512 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="proxy-httpd" containerID="cri-o://977b3dbaa3801356a9cef6ba013b8c9187342f9f407f6ad4427e214af08ea598" gracePeriod=30 Oct 06 08:41:22 crc 
kubenswrapper[4755]: I1006 08:41:22.524611 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="sg-core" containerID="cri-o://a29021b326428b7a6a7336f78794803385c35f381705af46ce931b01ac90ca9b" gracePeriod=30 Oct 06 08:41:22 crc kubenswrapper[4755]: I1006 08:41:22.524667 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="ceilometer-notification-agent" containerID="cri-o://520aaa97134755f260b7a1fafad978d709e978280fa4d7d2d417d181da9efb6b" gracePeriod=30 Oct 06 08:41:22 crc kubenswrapper[4755]: I1006 08:41:22.858047 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:41:22 crc kubenswrapper[4755]: I1006 08:41:22.908410 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:23 crc kubenswrapper[4755]: I1006 08:41:23.245141 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" event={"ID":"6bcd099d-2fe7-4237-9338-e7a9aefc1dec","Type":"ContainerStarted","Data":"cf1af80eda9cb5c3c2b2ae4f307be6b6baeeaea99447a91949243d1b05a54398"} Oct 06 08:41:23 crc kubenswrapper[4755]: I1006 08:41:23.247025 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" Oct 06 08:41:23 crc kubenswrapper[4755]: I1006 08:41:23.252425 4755 generic.go:334] "Generic (PLEG): container finished" podID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerID="977b3dbaa3801356a9cef6ba013b8c9187342f9f407f6ad4427e214af08ea598" exitCode=0 Oct 06 08:41:23 crc kubenswrapper[4755]: I1006 08:41:23.252459 4755 generic.go:334] "Generic (PLEG): container finished" podID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerID="a29021b326428b7a6a7336f78794803385c35f381705af46ce931b01ac90ca9b" 
exitCode=2 Oct 06 08:41:23 crc kubenswrapper[4755]: I1006 08:41:23.252466 4755 generic.go:334] "Generic (PLEG): container finished" podID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerID="4031ca26b25c3fa4fe7efa088d8541608f19ba99fdb5e3a93d2e03acde1582b6" exitCode=0 Oct 06 08:41:23 crc kubenswrapper[4755]: I1006 08:41:23.252510 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87511693-db15-4fef-b3f8-a48e99ddfb0b","Type":"ContainerDied","Data":"977b3dbaa3801356a9cef6ba013b8c9187342f9f407f6ad4427e214af08ea598"} Oct 06 08:41:23 crc kubenswrapper[4755]: I1006 08:41:23.252657 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87511693-db15-4fef-b3f8-a48e99ddfb0b","Type":"ContainerDied","Data":"a29021b326428b7a6a7336f78794803385c35f381705af46ce931b01ac90ca9b"} Oct 06 08:41:23 crc kubenswrapper[4755]: I1006 08:41:23.252676 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87511693-db15-4fef-b3f8-a48e99ddfb0b","Type":"ContainerDied","Data":"4031ca26b25c3fa4fe7efa088d8541608f19ba99fdb5e3a93d2e03acde1582b6"} Oct 06 08:41:23 crc kubenswrapper[4755]: I1006 08:41:23.252791 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6775ab5f-5b0b-4853-a743-37e2070c5419" containerName="nova-api-log" containerID="cri-o://5e05655ffdc430e06918543aa3e4bcf7c45690487f5da46ee12ba50e466e3c37" gracePeriod=30 Oct 06 08:41:23 crc kubenswrapper[4755]: I1006 08:41:23.252936 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6775ab5f-5b0b-4853-a743-37e2070c5419" containerName="nova-api-api" containerID="cri-o://951b5cf926b19c23d46ed6c2d5f2200f5d4df155fac885177ed6b90d62314563" gracePeriod=30 Oct 06 08:41:23 crc kubenswrapper[4755]: I1006 08:41:23.272179 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" podStartSLOduration=3.272158914 podStartE2EDuration="3.272158914s" podCreationTimestamp="2025-10-06 08:41:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:23.264870767 +0000 UTC m=+1140.094185991" watchObservedRunningTime="2025-10-06 08:41:23.272158914 +0000 UTC m=+1140.101474148" Oct 06 08:41:24 crc kubenswrapper[4755]: I1006 08:41:24.262308 4755 generic.go:334] "Generic (PLEG): container finished" podID="6775ab5f-5b0b-4853-a743-37e2070c5419" containerID="5e05655ffdc430e06918543aa3e4bcf7c45690487f5da46ee12ba50e466e3c37" exitCode=143 Oct 06 08:41:24 crc kubenswrapper[4755]: I1006 08:41:24.262362 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6775ab5f-5b0b-4853-a743-37e2070c5419","Type":"ContainerDied","Data":"5e05655ffdc430e06918543aa3e4bcf7c45690487f5da46ee12ba50e466e3c37"} Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.274718 4755 generic.go:334] "Generic (PLEG): container finished" podID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerID="520aaa97134755f260b7a1fafad978d709e978280fa4d7d2d417d181da9efb6b" exitCode=0 Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.274767 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87511693-db15-4fef-b3f8-a48e99ddfb0b","Type":"ContainerDied","Data":"520aaa97134755f260b7a1fafad978d709e978280fa4d7d2d417d181da9efb6b"} Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.613903 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.682293 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-ceilometer-tls-certs\") pod \"87511693-db15-4fef-b3f8-a48e99ddfb0b\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.682500 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-config-data\") pod \"87511693-db15-4fef-b3f8-a48e99ddfb0b\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.682549 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-sg-core-conf-yaml\") pod \"87511693-db15-4fef-b3f8-a48e99ddfb0b\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.682699 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-combined-ca-bundle\") pod \"87511693-db15-4fef-b3f8-a48e99ddfb0b\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.682759 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-scripts\") pod \"87511693-db15-4fef-b3f8-a48e99ddfb0b\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.682804 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltv6v\" (UniqueName: 
\"kubernetes.io/projected/87511693-db15-4fef-b3f8-a48e99ddfb0b-kube-api-access-ltv6v\") pod \"87511693-db15-4fef-b3f8-a48e99ddfb0b\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.682846 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87511693-db15-4fef-b3f8-a48e99ddfb0b-log-httpd\") pod \"87511693-db15-4fef-b3f8-a48e99ddfb0b\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.682984 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87511693-db15-4fef-b3f8-a48e99ddfb0b-run-httpd\") pod \"87511693-db15-4fef-b3f8-a48e99ddfb0b\" (UID: \"87511693-db15-4fef-b3f8-a48e99ddfb0b\") " Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.683856 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87511693-db15-4fef-b3f8-a48e99ddfb0b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "87511693-db15-4fef-b3f8-a48e99ddfb0b" (UID: "87511693-db15-4fef-b3f8-a48e99ddfb0b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.684031 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87511693-db15-4fef-b3f8-a48e99ddfb0b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "87511693-db15-4fef-b3f8-a48e99ddfb0b" (UID: "87511693-db15-4fef-b3f8-a48e99ddfb0b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.688815 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87511693-db15-4fef-b3f8-a48e99ddfb0b-kube-api-access-ltv6v" (OuterVolumeSpecName: "kube-api-access-ltv6v") pod "87511693-db15-4fef-b3f8-a48e99ddfb0b" (UID: "87511693-db15-4fef-b3f8-a48e99ddfb0b"). InnerVolumeSpecName "kube-api-access-ltv6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.689765 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-scripts" (OuterVolumeSpecName: "scripts") pod "87511693-db15-4fef-b3f8-a48e99ddfb0b" (UID: "87511693-db15-4fef-b3f8-a48e99ddfb0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.714525 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "87511693-db15-4fef-b3f8-a48e99ddfb0b" (UID: "87511693-db15-4fef-b3f8-a48e99ddfb0b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.750783 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "87511693-db15-4fef-b3f8-a48e99ddfb0b" (UID: "87511693-db15-4fef-b3f8-a48e99ddfb0b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.759999 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87511693-db15-4fef-b3f8-a48e99ddfb0b" (UID: "87511693-db15-4fef-b3f8-a48e99ddfb0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.784833 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87511693-db15-4fef-b3f8-a48e99ddfb0b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.784868 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.784878 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.784886 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.784895 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.784906 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltv6v\" (UniqueName: 
\"kubernetes.io/projected/87511693-db15-4fef-b3f8-a48e99ddfb0b-kube-api-access-ltv6v\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.784914 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87511693-db15-4fef-b3f8-a48e99ddfb0b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.787927 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-config-data" (OuterVolumeSpecName: "config-data") pod "87511693-db15-4fef-b3f8-a48e99ddfb0b" (UID: "87511693-db15-4fef-b3f8-a48e99ddfb0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:25 crc kubenswrapper[4755]: I1006 08:41:25.886709 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87511693-db15-4fef-b3f8-a48e99ddfb0b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.286831 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87511693-db15-4fef-b3f8-a48e99ddfb0b","Type":"ContainerDied","Data":"368fcee0b92826739d697260d0674f6fbdfe233de321437487511307c9d8ef9e"} Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.286892 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.287184 4755 scope.go:117] "RemoveContainer" containerID="977b3dbaa3801356a9cef6ba013b8c9187342f9f407f6ad4427e214af08ea598" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.310130 4755 scope.go:117] "RemoveContainer" containerID="a29021b326428b7a6a7336f78794803385c35f381705af46ce931b01ac90ca9b" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.317605 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.329081 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.346488 4755 scope.go:117] "RemoveContainer" containerID="520aaa97134755f260b7a1fafad978d709e978280fa4d7d2d417d181da9efb6b" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.352858 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:26 crc kubenswrapper[4755]: E1006 08:41:26.353667 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="ceilometer-notification-agent" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.353692 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="ceilometer-notification-agent" Oct 06 08:41:26 crc kubenswrapper[4755]: E1006 08:41:26.353728 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="sg-core" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.353738 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="sg-core" Oct 06 08:41:26 crc kubenswrapper[4755]: E1006 08:41:26.353761 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="ceilometer-central-agent" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.353768 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="ceilometer-central-agent" Oct 06 08:41:26 crc kubenswrapper[4755]: E1006 08:41:26.353789 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="proxy-httpd" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.353797 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="proxy-httpd" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.354181 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="sg-core" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.354209 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="ceilometer-central-agent" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.354233 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="ceilometer-notification-agent" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.354257 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" containerName="proxy-httpd" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.357699 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.361117 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.361276 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.362081 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.376450 4755 scope.go:117] "RemoveContainer" containerID="4031ca26b25c3fa4fe7efa088d8541608f19ba99fdb5e3a93d2e03acde1582b6" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.382605 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.394705 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86dfd\" (UniqueName: \"kubernetes.io/projected/1ae6bbc1-632c-4769-9fcf-b7689df07c49-kube-api-access-86dfd\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.394775 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-scripts\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.394813 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " 
pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.394873 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-config-data\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.394919 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.394978 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae6bbc1-632c-4769-9fcf-b7689df07c49-run-httpd\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.395014 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae6bbc1-632c-4769-9fcf-b7689df07c49-log-httpd\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.395044 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.496727 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.496826 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86dfd\" (UniqueName: \"kubernetes.io/projected/1ae6bbc1-632c-4769-9fcf-b7689df07c49-kube-api-access-86dfd\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.496859 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-scripts\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.496890 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.496948 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-config-data\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.496990 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.497048 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae6bbc1-632c-4769-9fcf-b7689df07c49-run-httpd\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.497084 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae6bbc1-632c-4769-9fcf-b7689df07c49-log-httpd\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.497614 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae6bbc1-632c-4769-9fcf-b7689df07c49-log-httpd\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.497869 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae6bbc1-632c-4769-9fcf-b7689df07c49-run-httpd\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.503785 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.504614 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.505489 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.506976 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-config-data\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.509740 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-scripts\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.525498 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86dfd\" (UniqueName: \"kubernetes.io/projected/1ae6bbc1-632c-4769-9fcf-b7689df07c49-kube-api-access-86dfd\") pod \"ceilometer-0\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.686930 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.768824 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.801350 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flqh6\" (UniqueName: \"kubernetes.io/projected/6775ab5f-5b0b-4853-a743-37e2070c5419-kube-api-access-flqh6\") pod \"6775ab5f-5b0b-4853-a743-37e2070c5419\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.801436 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6775ab5f-5b0b-4853-a743-37e2070c5419-logs\") pod \"6775ab5f-5b0b-4853-a743-37e2070c5419\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.801505 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6775ab5f-5b0b-4853-a743-37e2070c5419-config-data\") pod \"6775ab5f-5b0b-4853-a743-37e2070c5419\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.801525 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6775ab5f-5b0b-4853-a743-37e2070c5419-combined-ca-bundle\") pod \"6775ab5f-5b0b-4853-a743-37e2070c5419\" (UID: \"6775ab5f-5b0b-4853-a743-37e2070c5419\") " Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.802096 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6775ab5f-5b0b-4853-a743-37e2070c5419-logs" (OuterVolumeSpecName: "logs") pod "6775ab5f-5b0b-4853-a743-37e2070c5419" (UID: "6775ab5f-5b0b-4853-a743-37e2070c5419"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.810869 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6775ab5f-5b0b-4853-a743-37e2070c5419-kube-api-access-flqh6" (OuterVolumeSpecName: "kube-api-access-flqh6") pod "6775ab5f-5b0b-4853-a743-37e2070c5419" (UID: "6775ab5f-5b0b-4853-a743-37e2070c5419"). InnerVolumeSpecName "kube-api-access-flqh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.841789 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6775ab5f-5b0b-4853-a743-37e2070c5419-config-data" (OuterVolumeSpecName: "config-data") pod "6775ab5f-5b0b-4853-a743-37e2070c5419" (UID: "6775ab5f-5b0b-4853-a743-37e2070c5419"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.851736 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6775ab5f-5b0b-4853-a743-37e2070c5419-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6775ab5f-5b0b-4853-a743-37e2070c5419" (UID: "6775ab5f-5b0b-4853-a743-37e2070c5419"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.903530 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flqh6\" (UniqueName: \"kubernetes.io/projected/6775ab5f-5b0b-4853-a743-37e2070c5419-kube-api-access-flqh6\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.903558 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6775ab5f-5b0b-4853-a743-37e2070c5419-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.903584 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6775ab5f-5b0b-4853-a743-37e2070c5419-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:26 crc kubenswrapper[4755]: I1006 08:41:26.903597 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6775ab5f-5b0b-4853-a743-37e2070c5419-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:27 crc kubenswrapper[4755]: W1006 08:41:27.202041 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ae6bbc1_632c_4769_9fcf_b7689df07c49.slice/crio-c45253b3ae45fd1897679d748a79c3679c864244dbe43e86ce67d7c4782464a9 WatchSource:0}: Error finding container c45253b3ae45fd1897679d748a79c3679c864244dbe43e86ce67d7c4782464a9: Status 404 returned error can't find the container with id c45253b3ae45fd1897679d748a79c3679c864244dbe43e86ce67d7c4782464a9 Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.205355 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.308344 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1ae6bbc1-632c-4769-9fcf-b7689df07c49","Type":"ContainerStarted","Data":"c45253b3ae45fd1897679d748a79c3679c864244dbe43e86ce67d7c4782464a9"} Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.311495 4755 generic.go:334] "Generic (PLEG): container finished" podID="6775ab5f-5b0b-4853-a743-37e2070c5419" containerID="951b5cf926b19c23d46ed6c2d5f2200f5d4df155fac885177ed6b90d62314563" exitCode=0 Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.311523 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6775ab5f-5b0b-4853-a743-37e2070c5419","Type":"ContainerDied","Data":"951b5cf926b19c23d46ed6c2d5f2200f5d4df155fac885177ed6b90d62314563"} Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.311539 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6775ab5f-5b0b-4853-a743-37e2070c5419","Type":"ContainerDied","Data":"5fe15638b1a60cfb0d01c63a27fb33c642daa1a375eb96b43da44c1ecebc2943"} Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.311555 4755 scope.go:117] "RemoveContainer" containerID="951b5cf926b19c23d46ed6c2d5f2200f5d4df155fac885177ed6b90d62314563" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.311649 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.343329 4755 scope.go:117] "RemoveContainer" containerID="5e05655ffdc430e06918543aa3e4bcf7c45690487f5da46ee12ba50e466e3c37" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.347583 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.373890 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.397022 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:27 crc kubenswrapper[4755]: E1006 08:41:27.397461 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6775ab5f-5b0b-4853-a743-37e2070c5419" containerName="nova-api-log" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.397477 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6775ab5f-5b0b-4853-a743-37e2070c5419" containerName="nova-api-log" Oct 06 08:41:27 crc kubenswrapper[4755]: E1006 08:41:27.397494 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6775ab5f-5b0b-4853-a743-37e2070c5419" containerName="nova-api-api" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.397502 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6775ab5f-5b0b-4853-a743-37e2070c5419" containerName="nova-api-api" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.397757 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6775ab5f-5b0b-4853-a743-37e2070c5419" containerName="nova-api-api" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.397791 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6775ab5f-5b0b-4853-a743-37e2070c5419" containerName="nova-api-log" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.399425 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.401823 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.402008 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.402127 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.404111 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.407322 4755 scope.go:117] "RemoveContainer" containerID="951b5cf926b19c23d46ed6c2d5f2200f5d4df155fac885177ed6b90d62314563" Oct 06 08:41:27 crc kubenswrapper[4755]: E1006 08:41:27.407713 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"951b5cf926b19c23d46ed6c2d5f2200f5d4df155fac885177ed6b90d62314563\": container with ID starting with 951b5cf926b19c23d46ed6c2d5f2200f5d4df155fac885177ed6b90d62314563 not found: ID does not exist" containerID="951b5cf926b19c23d46ed6c2d5f2200f5d4df155fac885177ed6b90d62314563" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.407749 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951b5cf926b19c23d46ed6c2d5f2200f5d4df155fac885177ed6b90d62314563"} err="failed to get container status \"951b5cf926b19c23d46ed6c2d5f2200f5d4df155fac885177ed6b90d62314563\": rpc error: code = NotFound desc = could not find container \"951b5cf926b19c23d46ed6c2d5f2200f5d4df155fac885177ed6b90d62314563\": container with ID starting with 951b5cf926b19c23d46ed6c2d5f2200f5d4df155fac885177ed6b90d62314563 not found: ID does not exist" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.407771 4755 
scope.go:117] "RemoveContainer" containerID="5e05655ffdc430e06918543aa3e4bcf7c45690487f5da46ee12ba50e466e3c37" Oct 06 08:41:27 crc kubenswrapper[4755]: E1006 08:41:27.408189 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e05655ffdc430e06918543aa3e4bcf7c45690487f5da46ee12ba50e466e3c37\": container with ID starting with 5e05655ffdc430e06918543aa3e4bcf7c45690487f5da46ee12ba50e466e3c37 not found: ID does not exist" containerID="5e05655ffdc430e06918543aa3e4bcf7c45690487f5da46ee12ba50e466e3c37" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.408212 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e05655ffdc430e06918543aa3e4bcf7c45690487f5da46ee12ba50e466e3c37"} err="failed to get container status \"5e05655ffdc430e06918543aa3e4bcf7c45690487f5da46ee12ba50e466e3c37\": rpc error: code = NotFound desc = could not find container \"5e05655ffdc430e06918543aa3e4bcf7c45690487f5da46ee12ba50e466e3c37\": container with ID starting with 5e05655ffdc430e06918543aa3e4bcf7c45690487f5da46ee12ba50e466e3c37 not found: ID does not exist" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.513866 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-config-data\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.513963 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-public-tls-certs\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.513998 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.514053 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.514124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08787948-f66a-4c1c-87b8-7cacd9a3d96b-logs\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.514161 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzzj4\" (UniqueName: \"kubernetes.io/projected/08787948-f66a-4c1c-87b8-7cacd9a3d96b-kube-api-access-nzzj4\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.615920 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08787948-f66a-4c1c-87b8-7cacd9a3d96b-logs\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.616229 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzzj4\" (UniqueName: 
\"kubernetes.io/projected/08787948-f66a-4c1c-87b8-7cacd9a3d96b-kube-api-access-nzzj4\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.616394 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-config-data\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.616516 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-public-tls-certs\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.616617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.616718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.617303 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08787948-f66a-4c1c-87b8-7cacd9a3d96b-logs\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.624012 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.624626 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-public-tls-certs\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.624669 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-config-data\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.624999 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.637817 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzzj4\" (UniqueName: \"kubernetes.io/projected/08787948-f66a-4c1c-87b8-7cacd9a3d96b-kube-api-access-nzzj4\") pod \"nova-api-0\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.719159 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.858878 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.895119 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6775ab5f-5b0b-4853-a743-37e2070c5419" path="/var/lib/kubelet/pods/6775ab5f-5b0b-4853-a743-37e2070c5419/volumes" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.896034 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87511693-db15-4fef-b3f8-a48e99ddfb0b" path="/var/lib/kubelet/pods/87511693-db15-4fef-b3f8-a48e99ddfb0b/volumes" Oct 06 08:41:27 crc kubenswrapper[4755]: I1006 08:41:27.896819 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.184624 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:28 crc kubenswrapper[4755]: W1006 08:41:28.191946 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08787948_f66a_4c1c_87b8_7cacd9a3d96b.slice/crio-23507d0c7b2532b7054758036fca875b29a3555bfa6d20a08bbab85c9fbe2e73 WatchSource:0}: Error finding container 23507d0c7b2532b7054758036fca875b29a3555bfa6d20a08bbab85c9fbe2e73: Status 404 returned error can't find the container with id 23507d0c7b2532b7054758036fca875b29a3555bfa6d20a08bbab85c9fbe2e73 Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.320797 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"08787948-f66a-4c1c-87b8-7cacd9a3d96b","Type":"ContainerStarted","Data":"23507d0c7b2532b7054758036fca875b29a3555bfa6d20a08bbab85c9fbe2e73"} Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.325482 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae6bbc1-632c-4769-9fcf-b7689df07c49","Type":"ContainerStarted","Data":"638f63dfe9467608b2ac9d625945426638e8ba88a8e26e39df89a6c21cccf998"} Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.356928 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.554215 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dwgtj"] Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.555982 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.558227 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.558642 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.562355 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dwgtj"] Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.643407 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65s7z\" (UniqueName: \"kubernetes.io/projected/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-kube-api-access-65s7z\") pod \"nova-cell1-cell-mapping-dwgtj\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.643531 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-config-data\") pod \"nova-cell1-cell-mapping-dwgtj\" (UID: 
\"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.643654 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-scripts\") pod \"nova-cell1-cell-mapping-dwgtj\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.643679 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dwgtj\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.745591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65s7z\" (UniqueName: \"kubernetes.io/projected/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-kube-api-access-65s7z\") pod \"nova-cell1-cell-mapping-dwgtj\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.745683 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-config-data\") pod \"nova-cell1-cell-mapping-dwgtj\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.745716 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-scripts\") pod \"nova-cell1-cell-mapping-dwgtj\" (UID: 
\"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.745743 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dwgtj\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.750263 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dwgtj\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.751099 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-scripts\") pod \"nova-cell1-cell-mapping-dwgtj\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.752404 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-config-data\") pod \"nova-cell1-cell-mapping-dwgtj\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.762021 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65s7z\" (UniqueName: \"kubernetes.io/projected/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-kube-api-access-65s7z\") pod \"nova-cell1-cell-mapping-dwgtj\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " 
pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:28 crc kubenswrapper[4755]: I1006 08:41:28.925323 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:29 crc kubenswrapper[4755]: I1006 08:41:29.338863 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae6bbc1-632c-4769-9fcf-b7689df07c49","Type":"ContainerStarted","Data":"5d8da95c99958715be84162c8afef341e752202f8105a441edf06b75cabbac35"} Oct 06 08:41:29 crc kubenswrapper[4755]: I1006 08:41:29.344367 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"08787948-f66a-4c1c-87b8-7cacd9a3d96b","Type":"ContainerStarted","Data":"a47f518b5144094972f3f9136558e17d0941fb5862486077ff04c1d479eb8d5a"} Oct 06 08:41:29 crc kubenswrapper[4755]: I1006 08:41:29.344421 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"08787948-f66a-4c1c-87b8-7cacd9a3d96b","Type":"ContainerStarted","Data":"b3ddb244271eadc85df60c48c578f543691e5bf8d1f44a978080f2e2f4faff18"} Oct 06 08:41:29 crc kubenswrapper[4755]: I1006 08:41:29.393356 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.393336901 podStartE2EDuration="2.393336901s" podCreationTimestamp="2025-10-06 08:41:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:29.367937923 +0000 UTC m=+1146.197253137" watchObservedRunningTime="2025-10-06 08:41:29.393336901 +0000 UTC m=+1146.222652115" Oct 06 08:41:29 crc kubenswrapper[4755]: I1006 08:41:29.399707 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dwgtj"] Oct 06 08:41:29 crc kubenswrapper[4755]: W1006 08:41:29.416881 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ffe843e_b0c1_40ab_a1ad_d412b46c03e3.slice/crio-ec50a3e0e5190a3ffd99bfd1b9c4119e76ace7c24d39a1cf5d0646dea68f5882 WatchSource:0}: Error finding container ec50a3e0e5190a3ffd99bfd1b9c4119e76ace7c24d39a1cf5d0646dea68f5882: Status 404 returned error can't find the container with id ec50a3e0e5190a3ffd99bfd1b9c4119e76ace7c24d39a1cf5d0646dea68f5882 Oct 06 08:41:30 crc kubenswrapper[4755]: I1006 08:41:30.365357 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae6bbc1-632c-4769-9fcf-b7689df07c49","Type":"ContainerStarted","Data":"4e38313bd3efd92a8f583d3a4edd56baf0d3c3f2ea4fb66d123230e6f4d045f8"} Oct 06 08:41:30 crc kubenswrapper[4755]: I1006 08:41:30.366456 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dwgtj" event={"ID":"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3","Type":"ContainerStarted","Data":"b211b2176638185955f3684109429af42717d7e2754bc90bbe094b52115dd34f"} Oct 06 08:41:30 crc kubenswrapper[4755]: I1006 08:41:30.366497 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dwgtj" event={"ID":"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3","Type":"ContainerStarted","Data":"ec50a3e0e5190a3ffd99bfd1b9c4119e76ace7c24d39a1cf5d0646dea68f5882"} Oct 06 08:41:30 crc kubenswrapper[4755]: I1006 08:41:30.721712 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" Oct 06 08:41:30 crc kubenswrapper[4755]: I1006 08:41:30.747229 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-dwgtj" podStartSLOduration=2.747208907 podStartE2EDuration="2.747208907s" podCreationTimestamp="2025-10-06 08:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:30.392943835 +0000 UTC 
m=+1147.222259079" watchObservedRunningTime="2025-10-06 08:41:30.747208907 +0000 UTC m=+1147.576524121" Oct 06 08:41:30 crc kubenswrapper[4755]: I1006 08:41:30.796942 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-rm2ld"] Oct 06 08:41:30 crc kubenswrapper[4755]: I1006 08:41:30.797155 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" podUID="4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" containerName="dnsmasq-dns" containerID="cri-o://a06c02377a226827f0e196db5b86adf81004c0aed0ba5818dc7b6977306028c9" gracePeriod=10 Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.317082 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.378428 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae6bbc1-632c-4769-9fcf-b7689df07c49","Type":"ContainerStarted","Data":"e5e67c6b6382ae2fb0033e1b5d33da861fb9e8c4b2b853df46a06fe5171ae912"} Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.378550 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.379890 4755 generic.go:334] "Generic (PLEG): container finished" podID="4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" containerID="a06c02377a226827f0e196db5b86adf81004c0aed0ba5818dc7b6977306028c9" exitCode=0 Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.380044 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" event={"ID":"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a","Type":"ContainerDied","Data":"a06c02377a226827f0e196db5b86adf81004c0aed0ba5818dc7b6977306028c9"} Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.380106 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" event={"ID":"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a","Type":"ContainerDied","Data":"f9e621997d2dbbfedb619af74a36de0a3d5a0cb52c995454acca54e04a60681d"} Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.380128 4755 scope.go:117] "RemoveContainer" containerID="a06c02377a226827f0e196db5b86adf81004c0aed0ba5818dc7b6977306028c9" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.380066 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-rm2ld" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.401578 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dvc8\" (UniqueName: \"kubernetes.io/projected/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-kube-api-access-4dvc8\") pod \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.401641 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-ovsdbserver-nb\") pod \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.401695 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-config\") pod \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.401854 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-ovsdbserver-sb\") pod \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " Oct 06 08:41:31 crc 
kubenswrapper[4755]: I1006 08:41:31.401897 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-dns-svc\") pod \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\" (UID: \"4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a\") " Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.402636 4755 scope.go:117] "RemoveContainer" containerID="ab9ef6cbb44056ec6d7941f58e46bb60db3c5ca7bd98320dae22b8f2534c0ff9" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.406874 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-kube-api-access-4dvc8" (OuterVolumeSpecName: "kube-api-access-4dvc8") pod "4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" (UID: "4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a"). InnerVolumeSpecName "kube-api-access-4dvc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.419513 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.636176081 podStartE2EDuration="5.419493297s" podCreationTimestamp="2025-10-06 08:41:26 +0000 UTC" firstStartedPulling="2025-10-06 08:41:27.204455292 +0000 UTC m=+1144.033770506" lastFinishedPulling="2025-10-06 08:41:30.987772508 +0000 UTC m=+1147.817087722" observedRunningTime="2025-10-06 08:41:31.409869222 +0000 UTC m=+1148.239184436" watchObservedRunningTime="2025-10-06 08:41:31.419493297 +0000 UTC m=+1148.248808511" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.445713 4755 scope.go:117] "RemoveContainer" containerID="a06c02377a226827f0e196db5b86adf81004c0aed0ba5818dc7b6977306028c9" Oct 06 08:41:31 crc kubenswrapper[4755]: E1006 08:41:31.449708 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a06c02377a226827f0e196db5b86adf81004c0aed0ba5818dc7b6977306028c9\": container with ID starting with a06c02377a226827f0e196db5b86adf81004c0aed0ba5818dc7b6977306028c9 not found: ID does not exist" containerID="a06c02377a226827f0e196db5b86adf81004c0aed0ba5818dc7b6977306028c9" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.449878 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a06c02377a226827f0e196db5b86adf81004c0aed0ba5818dc7b6977306028c9"} err="failed to get container status \"a06c02377a226827f0e196db5b86adf81004c0aed0ba5818dc7b6977306028c9\": rpc error: code = NotFound desc = could not find container \"a06c02377a226827f0e196db5b86adf81004c0aed0ba5818dc7b6977306028c9\": container with ID starting with a06c02377a226827f0e196db5b86adf81004c0aed0ba5818dc7b6977306028c9 not found: ID does not exist" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.449980 4755 scope.go:117] "RemoveContainer" containerID="ab9ef6cbb44056ec6d7941f58e46bb60db3c5ca7bd98320dae22b8f2534c0ff9" Oct 06 08:41:31 crc kubenswrapper[4755]: E1006 08:41:31.450870 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab9ef6cbb44056ec6d7941f58e46bb60db3c5ca7bd98320dae22b8f2534c0ff9\": container with ID starting with ab9ef6cbb44056ec6d7941f58e46bb60db3c5ca7bd98320dae22b8f2534c0ff9 not found: ID does not exist" containerID="ab9ef6cbb44056ec6d7941f58e46bb60db3c5ca7bd98320dae22b8f2534c0ff9" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.450937 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab9ef6cbb44056ec6d7941f58e46bb60db3c5ca7bd98320dae22b8f2534c0ff9"} err="failed to get container status \"ab9ef6cbb44056ec6d7941f58e46bb60db3c5ca7bd98320dae22b8f2534c0ff9\": rpc error: code = NotFound desc = could not find container \"ab9ef6cbb44056ec6d7941f58e46bb60db3c5ca7bd98320dae22b8f2534c0ff9\": container with ID 
starting with ab9ef6cbb44056ec6d7941f58e46bb60db3c5ca7bd98320dae22b8f2534c0ff9 not found: ID does not exist" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.459050 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" (UID: "4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.473968 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" (UID: "4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.479157 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-config" (OuterVolumeSpecName: "config") pod "4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" (UID: "4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.484289 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" (UID: "4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.505789 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.505831 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.505842 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dvc8\" (UniqueName: \"kubernetes.io/projected/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-kube-api-access-4dvc8\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.505852 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.505861 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.713039 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-rm2ld"] Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.722371 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-rm2ld"] Oct 06 08:41:31 crc kubenswrapper[4755]: I1006 08:41:31.899770 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" path="/var/lib/kubelet/pods/4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a/volumes" Oct 06 08:41:35 crc kubenswrapper[4755]: 
I1006 08:41:35.418653 4755 generic.go:334] "Generic (PLEG): container finished" podID="9ffe843e-b0c1-40ab-a1ad-d412b46c03e3" containerID="b211b2176638185955f3684109429af42717d7e2754bc90bbe094b52115dd34f" exitCode=0 Oct 06 08:41:35 crc kubenswrapper[4755]: I1006 08:41:35.418762 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dwgtj" event={"ID":"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3","Type":"ContainerDied","Data":"b211b2176638185955f3684109429af42717d7e2754bc90bbe094b52115dd34f"} Oct 06 08:41:36 crc kubenswrapper[4755]: I1006 08:41:36.779801 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:36 crc kubenswrapper[4755]: I1006 08:41:36.908846 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-config-data\") pod \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " Oct 06 08:41:36 crc kubenswrapper[4755]: I1006 08:41:36.908967 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-combined-ca-bundle\") pod \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " Oct 06 08:41:36 crc kubenswrapper[4755]: I1006 08:41:36.909001 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-scripts\") pod \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " Oct 06 08:41:36 crc kubenswrapper[4755]: I1006 08:41:36.909129 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65s7z\" (UniqueName: 
\"kubernetes.io/projected/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-kube-api-access-65s7z\") pod \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\" (UID: \"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3\") " Oct 06 08:41:36 crc kubenswrapper[4755]: I1006 08:41:36.915481 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-kube-api-access-65s7z" (OuterVolumeSpecName: "kube-api-access-65s7z") pod "9ffe843e-b0c1-40ab-a1ad-d412b46c03e3" (UID: "9ffe843e-b0c1-40ab-a1ad-d412b46c03e3"). InnerVolumeSpecName "kube-api-access-65s7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:36 crc kubenswrapper[4755]: I1006 08:41:36.916456 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-scripts" (OuterVolumeSpecName: "scripts") pod "9ffe843e-b0c1-40ab-a1ad-d412b46c03e3" (UID: "9ffe843e-b0c1-40ab-a1ad-d412b46c03e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:36 crc kubenswrapper[4755]: I1006 08:41:36.935832 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-config-data" (OuterVolumeSpecName: "config-data") pod "9ffe843e-b0c1-40ab-a1ad-d412b46c03e3" (UID: "9ffe843e-b0c1-40ab-a1ad-d412b46c03e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:36 crc kubenswrapper[4755]: I1006 08:41:36.941304 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ffe843e-b0c1-40ab-a1ad-d412b46c03e3" (UID: "9ffe843e-b0c1-40ab-a1ad-d412b46c03e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.011736 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.011763 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.011827 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65s7z\" (UniqueName: \"kubernetes.io/projected/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-kube-api-access-65s7z\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.011873 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.444845 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dwgtj" event={"ID":"9ffe843e-b0c1-40ab-a1ad-d412b46c03e3","Type":"ContainerDied","Data":"ec50a3e0e5190a3ffd99bfd1b9c4119e76ace7c24d39a1cf5d0646dea68f5882"} Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.444906 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec50a3e0e5190a3ffd99bfd1b9c4119e76ace7c24d39a1cf5d0646dea68f5882" Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.445369 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dwgtj" Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.640444 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.641329 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8b4fe16f-996e-4563-9d52-0d324aad3eb5" containerName="nova-scheduler-scheduler" containerID="cri-o://c659f6a071b4d2c7eac56c39d72dae7e4437f76987b984a342bd50e39835573b" gracePeriod=30 Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.656835 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.657668 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="08787948-f66a-4c1c-87b8-7cacd9a3d96b" containerName="nova-api-log" containerID="cri-o://b3ddb244271eadc85df60c48c578f543691e5bf8d1f44a978080f2e2f4faff18" gracePeriod=30 Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.657701 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="08787948-f66a-4c1c-87b8-7cacd9a3d96b" containerName="nova-api-api" containerID="cri-o://a47f518b5144094972f3f9136558e17d0941fb5862486077ff04c1d479eb8d5a" gracePeriod=30 Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.731981 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.732480 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7e657415-bec0-4087-936c-aee106e6e624" containerName="nova-metadata-log" containerID="cri-o://35c3a15be3296878f4ac01a36c97a4ee83bffb62f920f88b9ecfe54edb3719a8" gracePeriod=30 Oct 06 08:41:37 crc kubenswrapper[4755]: I1006 08:41:37.732754 4755 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7e657415-bec0-4087-936c-aee106e6e624" containerName="nova-metadata-metadata" containerID="cri-o://cb9d2ce7c5f4e55395eb161409704dc26c35093116bb43f11634f9025660af46" gracePeriod=30 Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.458535 4755 generic.go:334] "Generic (PLEG): container finished" podID="7e657415-bec0-4087-936c-aee106e6e624" containerID="35c3a15be3296878f4ac01a36c97a4ee83bffb62f920f88b9ecfe54edb3719a8" exitCode=143 Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.458595 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7e657415-bec0-4087-936c-aee106e6e624","Type":"ContainerDied","Data":"35c3a15be3296878f4ac01a36c97a4ee83bffb62f920f88b9ecfe54edb3719a8"} Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.461047 4755 generic.go:334] "Generic (PLEG): container finished" podID="8b4fe16f-996e-4563-9d52-0d324aad3eb5" containerID="c659f6a071b4d2c7eac56c39d72dae7e4437f76987b984a342bd50e39835573b" exitCode=0 Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.461096 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b4fe16f-996e-4563-9d52-0d324aad3eb5","Type":"ContainerDied","Data":"c659f6a071b4d2c7eac56c39d72dae7e4437f76987b984a342bd50e39835573b"} Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.463265 4755 generic.go:334] "Generic (PLEG): container finished" podID="08787948-f66a-4c1c-87b8-7cacd9a3d96b" containerID="a47f518b5144094972f3f9136558e17d0941fb5862486077ff04c1d479eb8d5a" exitCode=0 Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.463279 4755 generic.go:334] "Generic (PLEG): container finished" podID="08787948-f66a-4c1c-87b8-7cacd9a3d96b" containerID="b3ddb244271eadc85df60c48c578f543691e5bf8d1f44a978080f2e2f4faff18" exitCode=143 Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.463293 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"08787948-f66a-4c1c-87b8-7cacd9a3d96b","Type":"ContainerDied","Data":"a47f518b5144094972f3f9136558e17d0941fb5862486077ff04c1d479eb8d5a"} Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.463307 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"08787948-f66a-4c1c-87b8-7cacd9a3d96b","Type":"ContainerDied","Data":"b3ddb244271eadc85df60c48c578f543691e5bf8d1f44a978080f2e2f4faff18"} Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.463315 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"08787948-f66a-4c1c-87b8-7cacd9a3d96b","Type":"ContainerDied","Data":"23507d0c7b2532b7054758036fca875b29a3555bfa6d20a08bbab85c9fbe2e73"} Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.463324 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23507d0c7b2532b7054758036fca875b29a3555bfa6d20a08bbab85c9fbe2e73" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.469508 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.541552 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08787948-f66a-4c1c-87b8-7cacd9a3d96b-logs\") pod \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.541649 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-public-tls-certs\") pod \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.541708 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-internal-tls-certs\") pod \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.541755 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-config-data\") pod \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.541808 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzzj4\" (UniqueName: \"kubernetes.io/projected/08787948-f66a-4c1c-87b8-7cacd9a3d96b-kube-api-access-nzzj4\") pod \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.541880 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-combined-ca-bundle\") pod \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\" (UID: \"08787948-f66a-4c1c-87b8-7cacd9a3d96b\") " Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.542129 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08787948-f66a-4c1c-87b8-7cacd9a3d96b-logs" (OuterVolumeSpecName: "logs") pod "08787948-f66a-4c1c-87b8-7cacd9a3d96b" (UID: "08787948-f66a-4c1c-87b8-7cacd9a3d96b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.542546 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08787948-f66a-4c1c-87b8-7cacd9a3d96b-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.547680 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08787948-f66a-4c1c-87b8-7cacd9a3d96b-kube-api-access-nzzj4" (OuterVolumeSpecName: "kube-api-access-nzzj4") pod "08787948-f66a-4c1c-87b8-7cacd9a3d96b" (UID: "08787948-f66a-4c1c-87b8-7cacd9a3d96b"). InnerVolumeSpecName "kube-api-access-nzzj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.566772 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.567380 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08787948-f66a-4c1c-87b8-7cacd9a3d96b" (UID: "08787948-f66a-4c1c-87b8-7cacd9a3d96b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.584317 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-config-data" (OuterVolumeSpecName: "config-data") pod "08787948-f66a-4c1c-87b8-7cacd9a3d96b" (UID: "08787948-f66a-4c1c-87b8-7cacd9a3d96b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.607785 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "08787948-f66a-4c1c-87b8-7cacd9a3d96b" (UID: "08787948-f66a-4c1c-87b8-7cacd9a3d96b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.611521 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "08787948-f66a-4c1c-87b8-7cacd9a3d96b" (UID: "08787948-f66a-4c1c-87b8-7cacd9a3d96b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.643820 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b4fe16f-996e-4563-9d52-0d324aad3eb5-config-data\") pod \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\" (UID: \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\") " Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.643867 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g2t8\" (UniqueName: \"kubernetes.io/projected/8b4fe16f-996e-4563-9d52-0d324aad3eb5-kube-api-access-4g2t8\") pod \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\" (UID: \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\") " Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.643911 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4fe16f-996e-4563-9d52-0d324aad3eb5-combined-ca-bundle\") pod \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\" (UID: \"8b4fe16f-996e-4563-9d52-0d324aad3eb5\") " Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.644334 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.644351 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.644359 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 
08:41:38.644368 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08787948-f66a-4c1c-87b8-7cacd9a3d96b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.644376 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzzj4\" (UniqueName: \"kubernetes.io/projected/08787948-f66a-4c1c-87b8-7cacd9a3d96b-kube-api-access-nzzj4\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.647639 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b4fe16f-996e-4563-9d52-0d324aad3eb5-kube-api-access-4g2t8" (OuterVolumeSpecName: "kube-api-access-4g2t8") pod "8b4fe16f-996e-4563-9d52-0d324aad3eb5" (UID: "8b4fe16f-996e-4563-9d52-0d324aad3eb5"). InnerVolumeSpecName "kube-api-access-4g2t8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.666778 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b4fe16f-996e-4563-9d52-0d324aad3eb5-config-data" (OuterVolumeSpecName: "config-data") pod "8b4fe16f-996e-4563-9d52-0d324aad3eb5" (UID: "8b4fe16f-996e-4563-9d52-0d324aad3eb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.667965 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b4fe16f-996e-4563-9d52-0d324aad3eb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b4fe16f-996e-4563-9d52-0d324aad3eb5" (UID: "8b4fe16f-996e-4563-9d52-0d324aad3eb5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.746067 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b4fe16f-996e-4563-9d52-0d324aad3eb5-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.746100 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g2t8\" (UniqueName: \"kubernetes.io/projected/8b4fe16f-996e-4563-9d52-0d324aad3eb5-kube-api-access-4g2t8\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:38 crc kubenswrapper[4755]: I1006 08:41:38.746111 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4fe16f-996e-4563-9d52-0d324aad3eb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.472054 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b4fe16f-996e-4563-9d52-0d324aad3eb5","Type":"ContainerDied","Data":"202423b1103284de6c91a61ccce3644dc9c95f0d39edcce8c3fc8c3ab5c44064"} Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.472092 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.472100 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.472111 4755 scope.go:117] "RemoveContainer" containerID="c659f6a071b4d2c7eac56c39d72dae7e4437f76987b984a342bd50e39835573b" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.504105 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.515880 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.523259 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.531012 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.540790 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:39 crc kubenswrapper[4755]: E1006 08:41:39.541306 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b4fe16f-996e-4563-9d52-0d324aad3eb5" containerName="nova-scheduler-scheduler" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.541329 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b4fe16f-996e-4563-9d52-0d324aad3eb5" containerName="nova-scheduler-scheduler" Oct 06 08:41:39 crc kubenswrapper[4755]: E1006 08:41:39.541343 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08787948-f66a-4c1c-87b8-7cacd9a3d96b" containerName="nova-api-api" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.541350 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="08787948-f66a-4c1c-87b8-7cacd9a3d96b" containerName="nova-api-api" Oct 06 08:41:39 crc kubenswrapper[4755]: E1006 08:41:39.541365 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" containerName="dnsmasq-dns" Oct 06 08:41:39 crc 
kubenswrapper[4755]: I1006 08:41:39.541372 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" containerName="dnsmasq-dns" Oct 06 08:41:39 crc kubenswrapper[4755]: E1006 08:41:39.541386 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffe843e-b0c1-40ab-a1ad-d412b46c03e3" containerName="nova-manage" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.541393 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffe843e-b0c1-40ab-a1ad-d412b46c03e3" containerName="nova-manage" Oct 06 08:41:39 crc kubenswrapper[4755]: E1006 08:41:39.541416 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08787948-f66a-4c1c-87b8-7cacd9a3d96b" containerName="nova-api-log" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.541422 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="08787948-f66a-4c1c-87b8-7cacd9a3d96b" containerName="nova-api-log" Oct 06 08:41:39 crc kubenswrapper[4755]: E1006 08:41:39.541441 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" containerName="init" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.541448 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" containerName="init" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.541621 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ffe843e-b0c1-40ab-a1ad-d412b46c03e3" containerName="nova-manage" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.541632 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4568d3a9-e2e7-4b90-ae90-dd3d8135cb7a" containerName="dnsmasq-dns" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.541645 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b4fe16f-996e-4563-9d52-0d324aad3eb5" containerName="nova-scheduler-scheduler" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.541662 
4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="08787948-f66a-4c1c-87b8-7cacd9a3d96b" containerName="nova-api-log" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.541669 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="08787948-f66a-4c1c-87b8-7cacd9a3d96b" containerName="nova-api-api" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.543496 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.550021 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.551637 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.561963 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.562084 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.562125 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.562286 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.562501 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.569767 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.676094 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9krfp\" (UniqueName: 
\"kubernetes.io/projected/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-kube-api-access-9krfp\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.676399 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27e627a-4f73-40cb-8ca0-e01c190266d3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e27e627a-4f73-40cb-8ca0-e01c190266d3\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.676418 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-logs\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.676437 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27e627a-4f73-40cb-8ca0-e01c190266d3-config-data\") pod \"nova-scheduler-0\" (UID: \"e27e627a-4f73-40cb-8ca0-e01c190266d3\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.676479 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.676665 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7fx4\" (UniqueName: \"kubernetes.io/projected/e27e627a-4f73-40cb-8ca0-e01c190266d3-kube-api-access-v7fx4\") pod \"nova-scheduler-0\" (UID: 
\"e27e627a-4f73-40cb-8ca0-e01c190266d3\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.676808 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-config-data\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.676886 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.676977 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-public-tls-certs\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.779016 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7fx4\" (UniqueName: \"kubernetes.io/projected/e27e627a-4f73-40cb-8ca0-e01c190266d3-kube-api-access-v7fx4\") pod \"nova-scheduler-0\" (UID: \"e27e627a-4f73-40cb-8ca0-e01c190266d3\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.779087 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-config-data\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.779122 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.779154 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-public-tls-certs\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.779198 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9krfp\" (UniqueName: \"kubernetes.io/projected/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-kube-api-access-9krfp\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.779950 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27e627a-4f73-40cb-8ca0-e01c190266d3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e27e627a-4f73-40cb-8ca0-e01c190266d3\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.779994 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-logs\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.780027 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27e627a-4f73-40cb-8ca0-e01c190266d3-config-data\") pod \"nova-scheduler-0\" (UID: 
\"e27e627a-4f73-40cb-8ca0-e01c190266d3\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.780078 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.781669 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-logs\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.783487 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.783922 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27e627a-4f73-40cb-8ca0-e01c190266d3-config-data\") pod \"nova-scheduler-0\" (UID: \"e27e627a-4f73-40cb-8ca0-e01c190266d3\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.783962 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27e627a-4f73-40cb-8ca0-e01c190266d3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e27e627a-4f73-40cb-8ca0-e01c190266d3\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.785019 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-config-data\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.793219 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-public-tls-certs\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.794769 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.797437 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7fx4\" (UniqueName: \"kubernetes.io/projected/e27e627a-4f73-40cb-8ca0-e01c190266d3-kube-api-access-v7fx4\") pod \"nova-scheduler-0\" (UID: \"e27e627a-4f73-40cb-8ca0-e01c190266d3\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.797728 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9krfp\" (UniqueName: \"kubernetes.io/projected/0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb-kube-api-access-9krfp\") pod \"nova-api-0\" (UID: \"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb\") " pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.886914 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.901797 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08787948-f66a-4c1c-87b8-7cacd9a3d96b" path="/var/lib/kubelet/pods/08787948-f66a-4c1c-87b8-7cacd9a3d96b/volumes" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.902392 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:41:39 crc kubenswrapper[4755]: I1006 08:41:39.902986 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b4fe16f-996e-4563-9d52-0d324aad3eb5" path="/var/lib/kubelet/pods/8b4fe16f-996e-4563-9d52-0d324aad3eb5/volumes" Oct 06 08:41:40 crc kubenswrapper[4755]: I1006 08:41:40.370765 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:40 crc kubenswrapper[4755]: W1006 08:41:40.373475 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c3b9ddb_5ed7_46cd_a389_09f1ff73c8eb.slice/crio-f5c4d2f7a9d68caa083e09d5a47904dcbb61e3c1dd34ae41c9488db0b5802383 WatchSource:0}: Error finding container f5c4d2f7a9d68caa083e09d5a47904dcbb61e3c1dd34ae41c9488db0b5802383: Status 404 returned error can't find the container with id f5c4d2f7a9d68caa083e09d5a47904dcbb61e3c1dd34ae41c9488db0b5802383 Oct 06 08:41:40 crc kubenswrapper[4755]: I1006 08:41:40.381957 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:40 crc kubenswrapper[4755]: I1006 08:41:40.482189 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e27e627a-4f73-40cb-8ca0-e01c190266d3","Type":"ContainerStarted","Data":"0ef440114404433b4e200aa8e41a91d8cf903219d35956b7cb1973a668cc73ba"} Oct 06 08:41:40 crc kubenswrapper[4755]: I1006 08:41:40.483726 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb","Type":"ContainerStarted","Data":"f5c4d2f7a9d68caa083e09d5a47904dcbb61e3c1dd34ae41c9488db0b5802383"} Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.209661 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7e657415-bec0-4087-936c-aee106e6e624" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.179:8775/\": read tcp 10.217.0.2:46440->10.217.0.179:8775: read: connection reset by peer" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.209820 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7e657415-bec0-4087-936c-aee106e6e624" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.179:8775/\": read tcp 10.217.0.2:46446->10.217.0.179:8775: read: connection reset by peer" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.503695 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb","Type":"ContainerStarted","Data":"5ef67423a23671fd2b583db80f62b821c43778e0eba8a7a6de6a3ae0b3ce825b"} Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.503765 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb","Type":"ContainerStarted","Data":"7055323594f1560f9b093d8e150c1d3b8c1b739f434ae54aa02267b7dbb9ef8e"} Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.511632 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e27e627a-4f73-40cb-8ca0-e01c190266d3","Type":"ContainerStarted","Data":"92f8cc7a42c0157c46642eec8349e87ea51ff046ea9b74a06c270212e73e2637"} Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.515202 4755 generic.go:334] "Generic (PLEG): container finished" podID="7e657415-bec0-4087-936c-aee106e6e624" 
containerID="cb9d2ce7c5f4e55395eb161409704dc26c35093116bb43f11634f9025660af46" exitCode=0 Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.515245 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7e657415-bec0-4087-936c-aee106e6e624","Type":"ContainerDied","Data":"cb9d2ce7c5f4e55395eb161409704dc26c35093116bb43f11634f9025660af46"} Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.533550 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.533523368 podStartE2EDuration="2.533523368s" podCreationTimestamp="2025-10-06 08:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:41.531977779 +0000 UTC m=+1158.361292993" watchObservedRunningTime="2025-10-06 08:41:41.533523368 +0000 UTC m=+1158.362838582" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.555685 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.555667734 podStartE2EDuration="2.555667734s" podCreationTimestamp="2025-10-06 08:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:41.552820561 +0000 UTC m=+1158.382135765" watchObservedRunningTime="2025-10-06 08:41:41.555667734 +0000 UTC m=+1158.384982938" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.605429 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.722120 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-combined-ca-bundle\") pod \"7e657415-bec0-4087-936c-aee106e6e624\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.722198 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e657415-bec0-4087-936c-aee106e6e624-logs\") pod \"7e657415-bec0-4087-936c-aee106e6e624\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.722232 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-config-data\") pod \"7e657415-bec0-4087-936c-aee106e6e624\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.722249 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-nova-metadata-tls-certs\") pod \"7e657415-bec0-4087-936c-aee106e6e624\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.722289 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99442\" (UniqueName: \"kubernetes.io/projected/7e657415-bec0-4087-936c-aee106e6e624-kube-api-access-99442\") pod \"7e657415-bec0-4087-936c-aee106e6e624\" (UID: \"7e657415-bec0-4087-936c-aee106e6e624\") " Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.723150 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7e657415-bec0-4087-936c-aee106e6e624-logs" (OuterVolumeSpecName: "logs") pod "7e657415-bec0-4087-936c-aee106e6e624" (UID: "7e657415-bec0-4087-936c-aee106e6e624"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.728107 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e657415-bec0-4087-936c-aee106e6e624-kube-api-access-99442" (OuterVolumeSpecName: "kube-api-access-99442") pod "7e657415-bec0-4087-936c-aee106e6e624" (UID: "7e657415-bec0-4087-936c-aee106e6e624"). InnerVolumeSpecName "kube-api-access-99442". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.751310 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-config-data" (OuterVolumeSpecName: "config-data") pod "7e657415-bec0-4087-936c-aee106e6e624" (UID: "7e657415-bec0-4087-936c-aee106e6e624"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.753486 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e657415-bec0-4087-936c-aee106e6e624" (UID: "7e657415-bec0-4087-936c-aee106e6e624"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.795801 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7e657415-bec0-4087-936c-aee106e6e624" (UID: "7e657415-bec0-4087-936c-aee106e6e624"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.824151 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.824183 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e657415-bec0-4087-936c-aee106e6e624-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.824194 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.824204 4755 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e657415-bec0-4087-936c-aee106e6e624-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:41 crc kubenswrapper[4755]: I1006 08:41:41.824214 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99442\" (UniqueName: \"kubernetes.io/projected/7e657415-bec0-4087-936c-aee106e6e624-kube-api-access-99442\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.530028 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7e657415-bec0-4087-936c-aee106e6e624","Type":"ContainerDied","Data":"641c833398c50cdae9a9374e766c5ac9446b5d1bb2c5e895baa5659f722072bd"} Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.530111 4755 scope.go:117] "RemoveContainer" containerID="cb9d2ce7c5f4e55395eb161409704dc26c35093116bb43f11634f9025660af46" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.530326 4755 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.562553 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.575031 4755 scope.go:117] "RemoveContainer" containerID="35c3a15be3296878f4ac01a36c97a4ee83bffb62f920f88b9ecfe54edb3719a8" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.581395 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.591928 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:42 crc kubenswrapper[4755]: E1006 08:41:42.592583 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e657415-bec0-4087-936c-aee106e6e624" containerName="nova-metadata-metadata" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.592611 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e657415-bec0-4087-936c-aee106e6e624" containerName="nova-metadata-metadata" Oct 06 08:41:42 crc kubenswrapper[4755]: E1006 08:41:42.592646 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e657415-bec0-4087-936c-aee106e6e624" containerName="nova-metadata-log" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.592668 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e657415-bec0-4087-936c-aee106e6e624" containerName="nova-metadata-log" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.592940 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e657415-bec0-4087-936c-aee106e6e624" containerName="nova-metadata-log" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.592960 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e657415-bec0-4087-936c-aee106e6e624" containerName="nova-metadata-metadata" Oct 06 08:41:42 crc 
kubenswrapper[4755]: I1006 08:41:42.594288 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.596767 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.597982 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.609196 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.740774 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-logs\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.740822 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.740909 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.740946 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8prlt\" (UniqueName: 
\"kubernetes.io/projected/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-kube-api-access-8prlt\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.740995 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-config-data\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.842812 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-logs\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.842899 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.842979 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.843012 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8prlt\" (UniqueName: \"kubernetes.io/projected/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-kube-api-access-8prlt\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") 
" pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.843077 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-config-data\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.843302 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-logs\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.846910 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-config-data\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.846928 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.847244 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.866933 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8prlt\" (UniqueName: 
\"kubernetes.io/projected/d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9-kube-api-access-8prlt\") pod \"nova-metadata-0\" (UID: \"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9\") " pod="openstack/nova-metadata-0" Oct 06 08:41:42 crc kubenswrapper[4755]: I1006 08:41:42.925763 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:41:43 crc kubenswrapper[4755]: I1006 08:41:43.357023 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:43 crc kubenswrapper[4755]: I1006 08:41:43.541098 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9","Type":"ContainerStarted","Data":"a87ce3531000a65ddd21e37cf07ade4510f39058b1a0660b651e3f04571309d3"} Oct 06 08:41:43 crc kubenswrapper[4755]: I1006 08:41:43.541419 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9","Type":"ContainerStarted","Data":"72bb3ecfddbe939d231f85a23beb51c4d51243ad854c52fb44cf704ee336df08"} Oct 06 08:41:43 crc kubenswrapper[4755]: I1006 08:41:43.890519 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e657415-bec0-4087-936c-aee106e6e624" path="/var/lib/kubelet/pods/7e657415-bec0-4087-936c-aee106e6e624/volumes" Oct 06 08:41:44 crc kubenswrapper[4755]: I1006 08:41:44.549148 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9","Type":"ContainerStarted","Data":"29f88bf58d767cce282a5cd87876c0272581534c85dd0c7e0e4451543dbada8b"} Oct 06 08:41:44 crc kubenswrapper[4755]: I1006 08:41:44.902757 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 08:41:47 crc kubenswrapper[4755]: I1006 08:41:47.926135 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Oct 06 08:41:47 crc kubenswrapper[4755]: I1006 08:41:47.926494 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 08:41:49 crc kubenswrapper[4755]: I1006 08:41:49.890551 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 08:41:49 crc kubenswrapper[4755]: I1006 08:41:49.891724 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 08:41:49 crc kubenswrapper[4755]: I1006 08:41:49.904668 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 08:41:49 crc kubenswrapper[4755]: I1006 08:41:49.930283 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 08:41:49 crc kubenswrapper[4755]: I1006 08:41:49.951371 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=7.9513478509999995 podStartE2EDuration="7.951347851s" podCreationTimestamp="2025-10-06 08:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:44.565218379 +0000 UTC m=+1161.394533613" watchObservedRunningTime="2025-10-06 08:41:49.951347851 +0000 UTC m=+1166.780663065" Oct 06 08:41:50 crc kubenswrapper[4755]: I1006 08:41:50.644842 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 08:41:50 crc kubenswrapper[4755]: I1006 08:41:50.899736 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 08:41:50 crc 
kubenswrapper[4755]: I1006 08:41:50.899755 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.188:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 08:41:52 crc kubenswrapper[4755]: I1006 08:41:52.926216 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 08:41:52 crc kubenswrapper[4755]: I1006 08:41:52.927764 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 08:41:53 crc kubenswrapper[4755]: I1006 08:41:53.939769 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 08:41:53 crc kubenswrapper[4755]: I1006 08:41:53.939807 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 08:41:56 crc kubenswrapper[4755]: I1006 08:41:56.695839 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 08:41:59 crc kubenswrapper[4755]: I1006 08:41:59.900690 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 08:41:59 crc kubenswrapper[4755]: I1006 08:41:59.901446 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 08:41:59 crc kubenswrapper[4755]: I1006 08:41:59.901854 4755 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 08:41:59 crc kubenswrapper[4755]: I1006 08:41:59.901909 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 08:41:59 crc kubenswrapper[4755]: I1006 08:41:59.909734 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 08:41:59 crc kubenswrapper[4755]: I1006 08:41:59.912162 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 08:42:02 crc kubenswrapper[4755]: I1006 08:42:02.932481 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 08:42:02 crc kubenswrapper[4755]: I1006 08:42:02.934941 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 08:42:02 crc kubenswrapper[4755]: I1006 08:42:02.937500 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 08:42:03 crc kubenswrapper[4755]: I1006 08:42:03.743767 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 08:42:11 crc kubenswrapper[4755]: I1006 08:42:11.920753 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:42:12 crc kubenswrapper[4755]: I1006 08:42:12.807342 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:42:16 crc kubenswrapper[4755]: I1006 08:42:16.509770 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" containerName="rabbitmq" containerID="cri-o://038237b0e55a1bd0d8c875ac304255eecfd4658e5d098bb3afe46936d263c699" gracePeriod=604796 Oct 06 08:42:16 crc kubenswrapper[4755]: I1006 08:42:16.570952 4755 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3d5d33a7-9480-466b-abb7-e8fc7cf08776" containerName="rabbitmq" containerID="cri-o://4e783bc16a71a21f85e7a0d4edbec9cf161b9d673784b7dc6aecaaaccbaf42fd" gracePeriod=604797 Oct 06 08:42:18 crc kubenswrapper[4755]: I1006 08:42:18.911862 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:42:18 crc kubenswrapper[4755]: I1006 08:42:18.912230 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:42:22 crc kubenswrapper[4755]: I1006 08:42:22.941589 4755 generic.go:334] "Generic (PLEG): container finished" podID="cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" containerID="038237b0e55a1bd0d8c875ac304255eecfd4658e5d098bb3afe46936d263c699" exitCode=0 Oct 06 08:42:22 crc kubenswrapper[4755]: I1006 08:42:22.941763 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905","Type":"ContainerDied","Data":"038237b0e55a1bd0d8c875ac304255eecfd4658e5d098bb3afe46936d263c699"} Oct 06 08:42:22 crc kubenswrapper[4755]: I1006 08:42:22.944117 4755 generic.go:334] "Generic (PLEG): container finished" podID="3d5d33a7-9480-466b-abb7-e8fc7cf08776" containerID="4e783bc16a71a21f85e7a0d4edbec9cf161b9d673784b7dc6aecaaaccbaf42fd" exitCode=0 Oct 06 08:42:22 crc kubenswrapper[4755]: I1006 08:42:22.944149 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d5d33a7-9480-466b-abb7-e8fc7cf08776","Type":"ContainerDied","Data":"4e783bc16a71a21f85e7a0d4edbec9cf161b9d673784b7dc6aecaaaccbaf42fd"} Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.101933 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.169530 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.303416 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-plugins-conf\") pod \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.303813 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-server-conf\") pod \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.303902 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-erlang-cookie\") pod \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.304050 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6t7j\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-kube-api-access-f6t7j\") pod \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " Oct 06 
08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.304127 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-server-conf\") pod \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.304190 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d5d33a7-9480-466b-abb7-e8fc7cf08776-erlang-cookie-secret\") pod \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.304278 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-erlang-cookie\") pod \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.304350 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-786md\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-kube-api-access-786md\") pod \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.304413 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-confd\") pod \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.304500 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-plugins\") pod \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.304584 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-plugins\") pod \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.304662 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-confd\") pod \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.304751 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-config-data\") pod \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.304821 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d5d33a7-9480-466b-abb7-e8fc7cf08776-pod-info\") pod \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.304905 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-plugins-conf\") pod \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.305009 
4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-erlang-cookie-secret\") pod \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.305158 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.305256 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.305337 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-tls\") pod \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.305463 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-pod-info\") pod \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\" (UID: \"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.305546 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-tls\") pod \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " Oct 06 
08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.305666 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-config-data\") pod \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\" (UID: \"3d5d33a7-9480-466b-abb7-e8fc7cf08776\") " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.304446 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3d5d33a7-9480-466b-abb7-e8fc7cf08776" (UID: "3d5d33a7-9480-466b-abb7-e8fc7cf08776"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.307340 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3d5d33a7-9480-466b-abb7-e8fc7cf08776" (UID: "3d5d33a7-9480-466b-abb7-e8fc7cf08776"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.311019 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" (UID: "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.311318 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" (UID: "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.314807 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" (UID: "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.315477 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3d5d33a7-9480-466b-abb7-e8fc7cf08776" (UID: "3d5d33a7-9480-466b-abb7-e8fc7cf08776"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.316528 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" (UID: "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.316668 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-kube-api-access-f6t7j" (OuterVolumeSpecName: "kube-api-access-f6t7j") pod "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" (UID: "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905"). InnerVolumeSpecName "kube-api-access-f6t7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.318122 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-kube-api-access-786md" (OuterVolumeSpecName: "kube-api-access-786md") pod "3d5d33a7-9480-466b-abb7-e8fc7cf08776" (UID: "3d5d33a7-9480-466b-abb7-e8fc7cf08776"). InnerVolumeSpecName "kube-api-access-786md". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.322545 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5d33a7-9480-466b-abb7-e8fc7cf08776-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3d5d33a7-9480-466b-abb7-e8fc7cf08776" (UID: "3d5d33a7-9480-466b-abb7-e8fc7cf08776"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.324842 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-pod-info" (OuterVolumeSpecName: "pod-info") pod "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" (UID: "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.324953 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" (UID: "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.326822 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" (UID: "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.326929 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "3d5d33a7-9480-466b-abb7-e8fc7cf08776" (UID: "3d5d33a7-9480-466b-abb7-e8fc7cf08776"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.334671 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3d5d33a7-9480-466b-abb7-e8fc7cf08776-pod-info" (OuterVolumeSpecName: "pod-info") pod "3d5d33a7-9480-466b-abb7-e8fc7cf08776" (UID: "3d5d33a7-9480-466b-abb7-e8fc7cf08776"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.349298 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3d5d33a7-9480-466b-abb7-e8fc7cf08776" (UID: "3d5d33a7-9480-466b-abb7-e8fc7cf08776"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.350439 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-config-data" (OuterVolumeSpecName: "config-data") pod "3d5d33a7-9480-466b-abb7-e8fc7cf08776" (UID: "3d5d33a7-9480-466b-abb7-e8fc7cf08776"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.360030 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-config-data" (OuterVolumeSpecName: "config-data") pod "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" (UID: "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.380625 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-server-conf" (OuterVolumeSpecName: "server-conf") pod "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" (UID: "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.381206 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-server-conf" (OuterVolumeSpecName: "server-conf") pod "3d5d33a7-9480-466b-abb7-e8fc7cf08776" (UID: "3d5d33a7-9480-466b-abb7-e8fc7cf08776"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409472 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-786md\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-kube-api-access-786md\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409508 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409517 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409528 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409537 4755 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d5d33a7-9480-466b-abb7-e8fc7cf08776-pod-info\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409546 4755 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409555 4755 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409605 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409621 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409632 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409640 4755 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-pod-info\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409649 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409658 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 
08:42:23.409666 4755 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409674 4755 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-server-conf\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409684 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409696 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6t7j\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-kube-api-access-f6t7j\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409706 4755 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d5d33a7-9480-466b-abb7-e8fc7cf08776-server-conf\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409716 4755 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d5d33a7-9480-466b-abb7-e8fc7cf08776-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.409724 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.428500 4755 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.430866 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" (UID: "cf0d28dc-714e-4fb4-ab1d-466d6b6ea905"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.434978 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.436360 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3d5d33a7-9480-466b-abb7-e8fc7cf08776" (UID: "3d5d33a7-9480-466b-abb7-e8fc7cf08776"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.511555 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.511595 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.511605 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d5d33a7-9480-466b-abb7-e8fc7cf08776-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.511614 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.954559 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.955410 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d5d33a7-9480-466b-abb7-e8fc7cf08776","Type":"ContainerDied","Data":"2ce1ee4ac36b9fe0943176a7ad5cfee5d6af263da83b5b649ffb7a606d7fd6d9"} Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.955447 4755 scope.go:117] "RemoveContainer" containerID="4e783bc16a71a21f85e7a0d4edbec9cf161b9d673784b7dc6aecaaaccbaf42fd" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.957729 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cf0d28dc-714e-4fb4-ab1d-466d6b6ea905","Type":"ContainerDied","Data":"409148eeb53fd888eaa0136472fad27bf10bd19eca9215596cf23ebf0824f1fd"} Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.957831 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.981865 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:42:23 crc kubenswrapper[4755]: I1006 08:42:23.983251 4755 scope.go:117] "RemoveContainer" containerID="16177d3511ed44688be5cb711444e8d032e2e2c914042e75ecd13cca11fbce6d" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.005756 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.008448 4755 scope.go:117] "RemoveContainer" containerID="038237b0e55a1bd0d8c875ac304255eecfd4658e5d098bb3afe46936d263c699" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.047552 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.053089 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/rabbitmq-server-0"] Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.068381 4755 scope.go:117] "RemoveContainer" containerID="1633e3c5c5ebfc34e508071c9f8e1f1237359e4c454fd67af3224492420f4fbd" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.069111 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:42:24 crc kubenswrapper[4755]: E1006 08:42:24.069469 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5d33a7-9480-466b-abb7-e8fc7cf08776" containerName="rabbitmq" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.069487 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5d33a7-9480-466b-abb7-e8fc7cf08776" containerName="rabbitmq" Oct 06 08:42:24 crc kubenswrapper[4755]: E1006 08:42:24.069508 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5d33a7-9480-466b-abb7-e8fc7cf08776" containerName="setup-container" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.069515 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5d33a7-9480-466b-abb7-e8fc7cf08776" containerName="setup-container" Oct 06 08:42:24 crc kubenswrapper[4755]: E1006 08:42:24.069526 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" containerName="setup-container" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.069532 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" containerName="setup-container" Oct 06 08:42:24 crc kubenswrapper[4755]: E1006 08:42:24.069542 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" containerName="rabbitmq" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.069548 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" containerName="rabbitmq" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.070004 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" containerName="rabbitmq" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.070023 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5d33a7-9480-466b-abb7-e8fc7cf08776" containerName="rabbitmq" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.070983 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.074012 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4wqc7" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.074151 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.078048 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.078393 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.078481 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.078664 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.079143 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.087649 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.127628 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-server-0"] Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.129015 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.143868 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.144130 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-47jjq" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.144322 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.144439 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.144537 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.144659 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.144775 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.153263 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.224606 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b378698d-a5e1-4538-93e2-694516a551b1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.224683 
4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b378698d-a5e1-4538-93e2-694516a551b1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.224728 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6jk5\" (UniqueName: \"kubernetes.io/projected/b378698d-a5e1-4538-93e2-694516a551b1-kube-api-access-f6jk5\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.224760 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b378698d-a5e1-4538-93e2-694516a551b1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.224814 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b378698d-a5e1-4538-93e2-694516a551b1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.224841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b378698d-a5e1-4538-93e2-694516a551b1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 
08:42:24.224865 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.224884 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b378698d-a5e1-4538-93e2-694516a551b1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.224915 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b378698d-a5e1-4538-93e2-694516a551b1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.224932 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b378698d-a5e1-4538-93e2-694516a551b1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.224955 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b378698d-a5e1-4538-93e2-694516a551b1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.326554 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b378698d-a5e1-4538-93e2-694516a551b1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.326644 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.326688 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.326723 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b378698d-a5e1-4538-93e2-694516a551b1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.326801 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b378698d-a5e1-4538-93e2-694516a551b1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.326829 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcphm\" 
(UniqueName: \"kubernetes.io/projected/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-kube-api-access-xcphm\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.326853 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b378698d-a5e1-4538-93e2-694516a551b1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.326879 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b378698d-a5e1-4538-93e2-694516a551b1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.326899 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.326930 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b378698d-a5e1-4538-93e2-694516a551b1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.326931 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b378698d-a5e1-4538-93e2-694516a551b1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.326967 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b378698d-a5e1-4538-93e2-694516a551b1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.327005 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-config-data\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.327033 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6jk5\" (UniqueName: \"kubernetes.io/projected/b378698d-a5e1-4538-93e2-694516a551b1-kube-api-access-f6jk5\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.327065 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.327093 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " 
pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.327109 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b378698d-a5e1-4538-93e2-694516a551b1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.327126 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.327166 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.327211 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b378698d-a5e1-4538-93e2-694516a551b1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.327227 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc 
kubenswrapper[4755]: I1006 08:42:24.327248 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.327264 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.327777 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b378698d-a5e1-4538-93e2-694516a551b1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.328379 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b378698d-a5e1-4538-93e2-694516a551b1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.328744 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b378698d-a5e1-4538-93e2-694516a551b1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.328904 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b378698d-a5e1-4538-93e2-694516a551b1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.329170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b378698d-a5e1-4538-93e2-694516a551b1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.332876 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b378698d-a5e1-4538-93e2-694516a551b1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.338117 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b378698d-a5e1-4538-93e2-694516a551b1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.339841 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b378698d-a5e1-4538-93e2-694516a551b1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.340239 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/b378698d-a5e1-4538-93e2-694516a551b1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.348020 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6jk5\" (UniqueName: \"kubernetes.io/projected/b378698d-a5e1-4538-93e2-694516a551b1-kube-api-access-f6jk5\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.353290 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b378698d-a5e1-4538-93e2-694516a551b1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.428929 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.429022 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.429045 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.429081 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.429111 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.429171 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcphm\" (UniqueName: \"kubernetes.io/projected/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-kube-api-access-xcphm\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.429189 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.429247 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-config-data\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.429273 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.429308 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.429325 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.430742 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.431122 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.431413 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.431486 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.431759 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.431940 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-config-data\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.434369 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.434874 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc 
kubenswrapper[4755]: I1006 08:42:24.435729 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.436909 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.447781 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcphm\" (UniqueName: \"kubernetes.io/projected/5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6-kube-api-access-xcphm\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.454453 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.475174 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6\") " pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.491175 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.911075 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:42:24 crc kubenswrapper[4755]: I1006 08:42:24.979394 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b378698d-a5e1-4538-93e2-694516a551b1","Type":"ContainerStarted","Data":"c3eff7bd13bae088bfaf90f6c4aab6e22f11c68c4cf0ab57f749ea4fc4ee57e7"} Oct 06 08:42:25 crc kubenswrapper[4755]: I1006 08:42:25.014105 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:42:25 crc kubenswrapper[4755]: I1006 08:42:25.895063 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d5d33a7-9480-466b-abb7-e8fc7cf08776" path="/var/lib/kubelet/pods/3d5d33a7-9480-466b-abb7-e8fc7cf08776/volumes" Oct 06 08:42:25 crc kubenswrapper[4755]: I1006 08:42:25.897284 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf0d28dc-714e-4fb4-ab1d-466d6b6ea905" path="/var/lib/kubelet/pods/cf0d28dc-714e-4fb4-ab1d-466d6b6ea905/volumes" Oct 06 08:42:25 crc kubenswrapper[4755]: I1006 08:42:25.994330 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6","Type":"ContainerStarted","Data":"fa607418e134dcb8f93914c97f85526b25fa901446b5094e24e589a595667ec4"} Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.762621 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-fktw7"] Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.775873 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-fktw7"] Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.775987 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.779038 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.886543 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.886657 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm9bk\" (UniqueName: \"kubernetes.io/projected/5c8459a9-cd78-49cb-ad57-2f5abb36a053-kube-api-access-pm9bk\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.886710 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.887310 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.887355 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.887387 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-config\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.989500 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.989578 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.989611 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.989878 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-config\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.989979 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.990054 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm9bk\" (UniqueName: \"kubernetes.io/projected/5c8459a9-cd78-49cb-ad57-2f5abb36a053-kube-api-access-pm9bk\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.990965 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.990989 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-config\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.990972 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.991199 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:26 crc kubenswrapper[4755]: I1006 08:42:26.991519 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:27 crc kubenswrapper[4755]: I1006 08:42:27.004309 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b378698d-a5e1-4538-93e2-694516a551b1","Type":"ContainerStarted","Data":"7d9377e03af13044f5f644473f4ee0a342aad7e48c362d87f6bb7c5698598c20"} Oct 06 08:42:27 crc kubenswrapper[4755]: I1006 08:42:27.006459 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6","Type":"ContainerStarted","Data":"04eaaec6f0ec12701247b7af45aaf15ddc04c4b56b02577641172549b0711054"} Oct 06 08:42:27 crc kubenswrapper[4755]: I1006 08:42:27.015526 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm9bk\" (UniqueName: \"kubernetes.io/projected/5c8459a9-cd78-49cb-ad57-2f5abb36a053-kube-api-access-pm9bk\") pod \"dnsmasq-dns-6447ccbd8f-fktw7\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:27 crc 
kubenswrapper[4755]: I1006 08:42:27.096769 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:27 crc kubenswrapper[4755]: I1006 08:42:27.360513 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-fktw7"] Oct 06 08:42:27 crc kubenswrapper[4755]: W1006 08:42:27.369835 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c8459a9_cd78_49cb_ad57_2f5abb36a053.slice/crio-b991d0f9d2d12f93f38190ddc8bcfe0086406e48bd51f1e5c837c1135bf40813 WatchSource:0}: Error finding container b991d0f9d2d12f93f38190ddc8bcfe0086406e48bd51f1e5c837c1135bf40813: Status 404 returned error can't find the container with id b991d0f9d2d12f93f38190ddc8bcfe0086406e48bd51f1e5c837c1135bf40813 Oct 06 08:42:28 crc kubenswrapper[4755]: I1006 08:42:28.015994 4755 generic.go:334] "Generic (PLEG): container finished" podID="5c8459a9-cd78-49cb-ad57-2f5abb36a053" containerID="fa541728905d97dbc628bd8feeaf44ff32d736bb5240544f6af377e7f065206f" exitCode=0 Oct 06 08:42:28 crc kubenswrapper[4755]: I1006 08:42:28.017727 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" event={"ID":"5c8459a9-cd78-49cb-ad57-2f5abb36a053","Type":"ContainerDied","Data":"fa541728905d97dbc628bd8feeaf44ff32d736bb5240544f6af377e7f065206f"} Oct 06 08:42:28 crc kubenswrapper[4755]: I1006 08:42:28.017762 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" event={"ID":"5c8459a9-cd78-49cb-ad57-2f5abb36a053","Type":"ContainerStarted","Data":"b991d0f9d2d12f93f38190ddc8bcfe0086406e48bd51f1e5c837c1135bf40813"} Oct 06 08:42:29 crc kubenswrapper[4755]: I1006 08:42:29.040216 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" 
event={"ID":"5c8459a9-cd78-49cb-ad57-2f5abb36a053","Type":"ContainerStarted","Data":"ad90ab608d0509fcf1054e5f0edd87bd5dd993992afbbfec92e1ec30d02700dc"} Oct 06 08:42:29 crc kubenswrapper[4755]: I1006 08:42:29.044557 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:29 crc kubenswrapper[4755]: I1006 08:42:29.068171 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" podStartSLOduration=3.068140365 podStartE2EDuration="3.068140365s" podCreationTimestamp="2025-10-06 08:42:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:29.064725201 +0000 UTC m=+1205.894040405" watchObservedRunningTime="2025-10-06 08:42:29.068140365 +0000 UTC m=+1205.897455579" Oct 06 08:42:34 crc kubenswrapper[4755]: E1006 08:42:34.111727 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d5d33a7_9480_466b_abb7_e8fc7cf08776.slice/crio-2ce1ee4ac36b9fe0943176a7ad5cfee5d6af263da83b5b649ffb7a606d7fd6d9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d5d33a7_9480_466b_abb7_e8fc7cf08776.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf0d28dc_714e_4fb4_ab1d_466d6b6ea905.slice\": RecentStats: unable to find data in memory cache]" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.098745 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.157439 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-lpr8r"] Oct 06 08:42:37 crc 
kubenswrapper[4755]: I1006 08:42:37.157714 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" podUID="6bcd099d-2fe7-4237-9338-e7a9aefc1dec" containerName="dnsmasq-dns" containerID="cri-o://cf1af80eda9cb5c3c2b2ae4f307be6b6baeeaea99447a91949243d1b05a54398" gracePeriod=10 Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.290315 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-tc5cr"] Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.291739 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.318775 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-tc5cr"] Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.395623 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.395700 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-config\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.395722 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " 
pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.395760 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.395823 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.395856 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkv88\" (UniqueName: \"kubernetes.io/projected/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-kube-api-access-gkv88\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.497738 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.498045 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkv88\" (UniqueName: \"kubernetes.io/projected/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-kube-api-access-gkv88\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" 
(UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.498094 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.498125 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-config\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.498145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.498181 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.498793 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " 
pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.498920 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.499452 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-config\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.499467 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.500000 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.522692 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkv88\" (UniqueName: \"kubernetes.io/projected/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-kube-api-access-gkv88\") pod \"dnsmasq-dns-864d5fc68c-tc5cr\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.645534 
4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.654091 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.811275 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-config\") pod \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.811627 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-ovsdbserver-nb\") pod \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.811736 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qfq5\" (UniqueName: \"kubernetes.io/projected/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-kube-api-access-5qfq5\") pod \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.811781 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-ovsdbserver-sb\") pod \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.811857 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-dns-svc\") pod 
\"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\" (UID: \"6bcd099d-2fe7-4237-9338-e7a9aefc1dec\") " Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.824287 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-kube-api-access-5qfq5" (OuterVolumeSpecName: "kube-api-access-5qfq5") pod "6bcd099d-2fe7-4237-9338-e7a9aefc1dec" (UID: "6bcd099d-2fe7-4237-9338-e7a9aefc1dec"). InnerVolumeSpecName "kube-api-access-5qfq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.860405 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6bcd099d-2fe7-4237-9338-e7a9aefc1dec" (UID: "6bcd099d-2fe7-4237-9338-e7a9aefc1dec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.863603 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6bcd099d-2fe7-4237-9338-e7a9aefc1dec" (UID: "6bcd099d-2fe7-4237-9338-e7a9aefc1dec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.868512 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6bcd099d-2fe7-4237-9338-e7a9aefc1dec" (UID: "6bcd099d-2fe7-4237-9338-e7a9aefc1dec"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.872964 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-config" (OuterVolumeSpecName: "config") pod "6bcd099d-2fe7-4237-9338-e7a9aefc1dec" (UID: "6bcd099d-2fe7-4237-9338-e7a9aefc1dec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.913482 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qfq5\" (UniqueName: \"kubernetes.io/projected/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-kube-api-access-5qfq5\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.913507 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.913518 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.913527 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4755]: I1006 08:42:37.913536 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6bcd099d-2fe7-4237-9338-e7a9aefc1dec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:38 crc kubenswrapper[4755]: I1006 08:42:38.144546 4755 generic.go:334] "Generic (PLEG): container finished" podID="6bcd099d-2fe7-4237-9338-e7a9aefc1dec" 
containerID="cf1af80eda9cb5c3c2b2ae4f307be6b6baeeaea99447a91949243d1b05a54398" exitCode=0 Oct 06 08:42:38 crc kubenswrapper[4755]: I1006 08:42:38.144622 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" event={"ID":"6bcd099d-2fe7-4237-9338-e7a9aefc1dec","Type":"ContainerDied","Data":"cf1af80eda9cb5c3c2b2ae4f307be6b6baeeaea99447a91949243d1b05a54398"} Oct 06 08:42:38 crc kubenswrapper[4755]: I1006 08:42:38.144650 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" event={"ID":"6bcd099d-2fe7-4237-9338-e7a9aefc1dec","Type":"ContainerDied","Data":"568c0bc1fb5d50c7b1973b3166bb09f9e4a704df085256b5b7b2f967b00cb582"} Oct 06 08:42:38 crc kubenswrapper[4755]: I1006 08:42:38.144668 4755 scope.go:117] "RemoveContainer" containerID="cf1af80eda9cb5c3c2b2ae4f307be6b6baeeaea99447a91949243d1b05a54398" Oct 06 08:42:38 crc kubenswrapper[4755]: I1006 08:42:38.144679 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-lpr8r" Oct 06 08:42:38 crc kubenswrapper[4755]: I1006 08:42:38.173886 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-lpr8r"] Oct 06 08:42:38 crc kubenswrapper[4755]: I1006 08:42:38.177727 4755 scope.go:117] "RemoveContainer" containerID="6fa35f50ed7adf2e029dd280ecdeecf85a2d3e0921e0a9bb55fa626c820c1ac9" Oct 06 08:42:38 crc kubenswrapper[4755]: I1006 08:42:38.181711 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-lpr8r"] Oct 06 08:42:38 crc kubenswrapper[4755]: W1006 08:42:38.184757 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e4b100e_d1f9_4bed_a11a_a6d3d593cc24.slice/crio-5cfffcc8f3c81e05c522743b8635eacd659610417002ca39efbef8aefb666b5a WatchSource:0}: Error finding container 5cfffcc8f3c81e05c522743b8635eacd659610417002ca39efbef8aefb666b5a: Status 404 returned error can't find the container with id 5cfffcc8f3c81e05c522743b8635eacd659610417002ca39efbef8aefb666b5a Oct 06 08:42:38 crc kubenswrapper[4755]: I1006 08:42:38.190853 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-tc5cr"] Oct 06 08:42:38 crc kubenswrapper[4755]: I1006 08:42:38.213132 4755 scope.go:117] "RemoveContainer" containerID="cf1af80eda9cb5c3c2b2ae4f307be6b6baeeaea99447a91949243d1b05a54398" Oct 06 08:42:38 crc kubenswrapper[4755]: E1006 08:42:38.213630 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf1af80eda9cb5c3c2b2ae4f307be6b6baeeaea99447a91949243d1b05a54398\": container with ID starting with cf1af80eda9cb5c3c2b2ae4f307be6b6baeeaea99447a91949243d1b05a54398 not found: ID does not exist" containerID="cf1af80eda9cb5c3c2b2ae4f307be6b6baeeaea99447a91949243d1b05a54398" Oct 06 08:42:38 crc kubenswrapper[4755]: I1006 08:42:38.213664 4755 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf1af80eda9cb5c3c2b2ae4f307be6b6baeeaea99447a91949243d1b05a54398"} err="failed to get container status \"cf1af80eda9cb5c3c2b2ae4f307be6b6baeeaea99447a91949243d1b05a54398\": rpc error: code = NotFound desc = could not find container \"cf1af80eda9cb5c3c2b2ae4f307be6b6baeeaea99447a91949243d1b05a54398\": container with ID starting with cf1af80eda9cb5c3c2b2ae4f307be6b6baeeaea99447a91949243d1b05a54398 not found: ID does not exist" Oct 06 08:42:38 crc kubenswrapper[4755]: I1006 08:42:38.213688 4755 scope.go:117] "RemoveContainer" containerID="6fa35f50ed7adf2e029dd280ecdeecf85a2d3e0921e0a9bb55fa626c820c1ac9" Oct 06 08:42:38 crc kubenswrapper[4755]: E1006 08:42:38.213954 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa35f50ed7adf2e029dd280ecdeecf85a2d3e0921e0a9bb55fa626c820c1ac9\": container with ID starting with 6fa35f50ed7adf2e029dd280ecdeecf85a2d3e0921e0a9bb55fa626c820c1ac9 not found: ID does not exist" containerID="6fa35f50ed7adf2e029dd280ecdeecf85a2d3e0921e0a9bb55fa626c820c1ac9" Oct 06 08:42:38 crc kubenswrapper[4755]: I1006 08:42:38.214023 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa35f50ed7adf2e029dd280ecdeecf85a2d3e0921e0a9bb55fa626c820c1ac9"} err="failed to get container status \"6fa35f50ed7adf2e029dd280ecdeecf85a2d3e0921e0a9bb55fa626c820c1ac9\": rpc error: code = NotFound desc = could not find container \"6fa35f50ed7adf2e029dd280ecdeecf85a2d3e0921e0a9bb55fa626c820c1ac9\": container with ID starting with 6fa35f50ed7adf2e029dd280ecdeecf85a2d3e0921e0a9bb55fa626c820c1ac9 not found: ID does not exist" Oct 06 08:42:39 crc kubenswrapper[4755]: I1006 08:42:39.156519 4755 generic.go:334] "Generic (PLEG): container finished" podID="5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" 
containerID="17e664427d57ea6f7d5fc4353c0b2d2a9b62c0a16f4031276879d9e6318cfe93" exitCode=0 Oct 06 08:42:39 crc kubenswrapper[4755]: I1006 08:42:39.156602 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" event={"ID":"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24","Type":"ContainerDied","Data":"17e664427d57ea6f7d5fc4353c0b2d2a9b62c0a16f4031276879d9e6318cfe93"} Oct 06 08:42:39 crc kubenswrapper[4755]: I1006 08:42:39.156920 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" event={"ID":"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24","Type":"ContainerStarted","Data":"5cfffcc8f3c81e05c522743b8635eacd659610417002ca39efbef8aefb666b5a"} Oct 06 08:42:39 crc kubenswrapper[4755]: I1006 08:42:39.887956 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bcd099d-2fe7-4237-9338-e7a9aefc1dec" path="/var/lib/kubelet/pods/6bcd099d-2fe7-4237-9338-e7a9aefc1dec/volumes" Oct 06 08:42:40 crc kubenswrapper[4755]: I1006 08:42:40.167969 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" event={"ID":"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24","Type":"ContainerStarted","Data":"bde12b463f3c2378a7a470cc022aa6a6180144213d828e3cd662be9efded7378"} Oct 06 08:42:40 crc kubenswrapper[4755]: I1006 08:42:40.168262 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:40 crc kubenswrapper[4755]: I1006 08:42:40.191955 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" podStartSLOduration=3.191924814 podStartE2EDuration="3.191924814s" podCreationTimestamp="2025-10-06 08:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:40.189544885 +0000 UTC m=+1217.018860149" watchObservedRunningTime="2025-10-06 08:42:40.191924814 
+0000 UTC m=+1217.021240038" Oct 06 08:42:44 crc kubenswrapper[4755]: E1006 08:42:44.425832 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d5d33a7_9480_466b_abb7_e8fc7cf08776.slice/crio-2ce1ee4ac36b9fe0943176a7ad5cfee5d6af263da83b5b649ffb7a606d7fd6d9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf0d28dc_714e_4fb4_ab1d_466d6b6ea905.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d5d33a7_9480_466b_abb7_e8fc7cf08776.slice\": RecentStats: unable to find data in memory cache]" Oct 06 08:42:47 crc kubenswrapper[4755]: I1006 08:42:47.647458 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 08:42:47 crc kubenswrapper[4755]: I1006 08:42:47.714639 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-fktw7"] Oct 06 08:42:47 crc kubenswrapper[4755]: I1006 08:42:47.724388 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" podUID="5c8459a9-cd78-49cb-ad57-2f5abb36a053" containerName="dnsmasq-dns" containerID="cri-o://ad90ab608d0509fcf1054e5f0edd87bd5dd993992afbbfec92e1ec30d02700dc" gracePeriod=10 Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.206130 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.246875 4755 generic.go:334] "Generic (PLEG): container finished" podID="5c8459a9-cd78-49cb-ad57-2f5abb36a053" containerID="ad90ab608d0509fcf1054e5f0edd87bd5dd993992afbbfec92e1ec30d02700dc" exitCode=0 Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.246918 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" event={"ID":"5c8459a9-cd78-49cb-ad57-2f5abb36a053","Type":"ContainerDied","Data":"ad90ab608d0509fcf1054e5f0edd87bd5dd993992afbbfec92e1ec30d02700dc"} Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.246945 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" event={"ID":"5c8459a9-cd78-49cb-ad57-2f5abb36a053","Type":"ContainerDied","Data":"b991d0f9d2d12f93f38190ddc8bcfe0086406e48bd51f1e5c837c1135bf40813"} Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.246962 4755 scope.go:117] "RemoveContainer" containerID="ad90ab608d0509fcf1054e5f0edd87bd5dd993992afbbfec92e1ec30d02700dc" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.247083 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-fktw7" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.272230 4755 scope.go:117] "RemoveContainer" containerID="fa541728905d97dbc628bd8feeaf44ff32d736bb5240544f6af377e7f065206f" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.300492 4755 scope.go:117] "RemoveContainer" containerID="ad90ab608d0509fcf1054e5f0edd87bd5dd993992afbbfec92e1ec30d02700dc" Oct 06 08:42:48 crc kubenswrapper[4755]: E1006 08:42:48.301012 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad90ab608d0509fcf1054e5f0edd87bd5dd993992afbbfec92e1ec30d02700dc\": container with ID starting with ad90ab608d0509fcf1054e5f0edd87bd5dd993992afbbfec92e1ec30d02700dc not found: ID does not exist" containerID="ad90ab608d0509fcf1054e5f0edd87bd5dd993992afbbfec92e1ec30d02700dc" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.301047 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad90ab608d0509fcf1054e5f0edd87bd5dd993992afbbfec92e1ec30d02700dc"} err="failed to get container status \"ad90ab608d0509fcf1054e5f0edd87bd5dd993992afbbfec92e1ec30d02700dc\": rpc error: code = NotFound desc = could not find container \"ad90ab608d0509fcf1054e5f0edd87bd5dd993992afbbfec92e1ec30d02700dc\": container with ID starting with ad90ab608d0509fcf1054e5f0edd87bd5dd993992afbbfec92e1ec30d02700dc not found: ID does not exist" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.301069 4755 scope.go:117] "RemoveContainer" containerID="fa541728905d97dbc628bd8feeaf44ff32d736bb5240544f6af377e7f065206f" Oct 06 08:42:48 crc kubenswrapper[4755]: E1006 08:42:48.301445 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa541728905d97dbc628bd8feeaf44ff32d736bb5240544f6af377e7f065206f\": container with ID starting with 
fa541728905d97dbc628bd8feeaf44ff32d736bb5240544f6af377e7f065206f not found: ID does not exist" containerID="fa541728905d97dbc628bd8feeaf44ff32d736bb5240544f6af377e7f065206f" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.301463 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa541728905d97dbc628bd8feeaf44ff32d736bb5240544f6af377e7f065206f"} err="failed to get container status \"fa541728905d97dbc628bd8feeaf44ff32d736bb5240544f6af377e7f065206f\": rpc error: code = NotFound desc = could not find container \"fa541728905d97dbc628bd8feeaf44ff32d736bb5240544f6af377e7f065206f\": container with ID starting with fa541728905d97dbc628bd8feeaf44ff32d736bb5240544f6af377e7f065206f not found: ID does not exist" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.318307 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-ovsdbserver-sb\") pod \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.318367 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-ovsdbserver-nb\") pod \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.318390 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-config\") pod \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.318429 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-dns-svc\") pod \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.318538 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-openstack-edpm-ipam\") pod \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.318681 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm9bk\" (UniqueName: \"kubernetes.io/projected/5c8459a9-cd78-49cb-ad57-2f5abb36a053-kube-api-access-pm9bk\") pod \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\" (UID: \"5c8459a9-cd78-49cb-ad57-2f5abb36a053\") " Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.337134 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8459a9-cd78-49cb-ad57-2f5abb36a053-kube-api-access-pm9bk" (OuterVolumeSpecName: "kube-api-access-pm9bk") pod "5c8459a9-cd78-49cb-ad57-2f5abb36a053" (UID: "5c8459a9-cd78-49cb-ad57-2f5abb36a053"). InnerVolumeSpecName "kube-api-access-pm9bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.396781 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5c8459a9-cd78-49cb-ad57-2f5abb36a053" (UID: "5c8459a9-cd78-49cb-ad57-2f5abb36a053"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.417120 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-config" (OuterVolumeSpecName: "config") pod "5c8459a9-cd78-49cb-ad57-2f5abb36a053" (UID: "5c8459a9-cd78-49cb-ad57-2f5abb36a053"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.420662 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm9bk\" (UniqueName: \"kubernetes.io/projected/5c8459a9-cd78-49cb-ad57-2f5abb36a053-kube-api-access-pm9bk\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.420695 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.420707 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.451065 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5c8459a9-cd78-49cb-ad57-2f5abb36a053" (UID: "5c8459a9-cd78-49cb-ad57-2f5abb36a053"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.452203 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5c8459a9-cd78-49cb-ad57-2f5abb36a053" (UID: "5c8459a9-cd78-49cb-ad57-2f5abb36a053"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.454987 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "5c8459a9-cd78-49cb-ad57-2f5abb36a053" (UID: "5c8459a9-cd78-49cb-ad57-2f5abb36a053"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.522846 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.522880 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.522889 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5c8459a9-cd78-49cb-ad57-2f5abb36a053-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.577468 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-fktw7"] Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.587114 
4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-fktw7"] Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.912366 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:42:48 crc kubenswrapper[4755]: I1006 08:42:48.912445 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:42:49 crc kubenswrapper[4755]: I1006 08:42:49.891797 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c8459a9-cd78-49cb-ad57-2f5abb36a053" path="/var/lib/kubelet/pods/5c8459a9-cd78-49cb-ad57-2f5abb36a053/volumes" Oct 06 08:42:54 crc kubenswrapper[4755]: E1006 08:42:54.688242 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d5d33a7_9480_466b_abb7_e8fc7cf08776.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d5d33a7_9480_466b_abb7_e8fc7cf08776.slice/crio-2ce1ee4ac36b9fe0943176a7ad5cfee5d6af263da83b5b649ffb7a606d7fd6d9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf0d28dc_714e_4fb4_ab1d_466d6b6ea905.slice\": RecentStats: unable to find data in memory cache]" Oct 06 08:42:57 crc kubenswrapper[4755]: I1006 08:42:57.957604 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg"] Oct 06 08:42:57 crc kubenswrapper[4755]: E1006 08:42:57.958817 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bcd099d-2fe7-4237-9338-e7a9aefc1dec" containerName="dnsmasq-dns" Oct 06 08:42:57 crc kubenswrapper[4755]: I1006 08:42:57.958836 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bcd099d-2fe7-4237-9338-e7a9aefc1dec" containerName="dnsmasq-dns" Oct 06 08:42:57 crc kubenswrapper[4755]: E1006 08:42:57.958850 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bcd099d-2fe7-4237-9338-e7a9aefc1dec" containerName="init" Oct 06 08:42:57 crc kubenswrapper[4755]: I1006 08:42:57.958858 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bcd099d-2fe7-4237-9338-e7a9aefc1dec" containerName="init" Oct 06 08:42:57 crc kubenswrapper[4755]: E1006 08:42:57.958871 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8459a9-cd78-49cb-ad57-2f5abb36a053" containerName="init" Oct 06 08:42:57 crc kubenswrapper[4755]: I1006 08:42:57.958880 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8459a9-cd78-49cb-ad57-2f5abb36a053" containerName="init" Oct 06 08:42:57 crc kubenswrapper[4755]: E1006 08:42:57.958909 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8459a9-cd78-49cb-ad57-2f5abb36a053" containerName="dnsmasq-dns" Oct 06 08:42:57 crc kubenswrapper[4755]: I1006 08:42:57.958916 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8459a9-cd78-49cb-ad57-2f5abb36a053" containerName="dnsmasq-dns" Oct 06 08:42:57 crc kubenswrapper[4755]: I1006 08:42:57.959139 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c8459a9-cd78-49cb-ad57-2f5abb36a053" containerName="dnsmasq-dns" Oct 06 08:42:57 crc kubenswrapper[4755]: I1006 08:42:57.959163 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bcd099d-2fe7-4237-9338-e7a9aefc1dec" containerName="dnsmasq-dns" 
Oct 06 08:42:57 crc kubenswrapper[4755]: I1006 08:42:57.960013 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:57 crc kubenswrapper[4755]: I1006 08:42:57.963868 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:42:57 crc kubenswrapper[4755]: I1006 08:42:57.964207 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:42:57 crc kubenswrapper[4755]: I1006 08:42:57.964352 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:42:57 crc kubenswrapper[4755]: I1006 08:42:57.964497 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:42:57 crc kubenswrapper[4755]: I1006 08:42:57.980866 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg"] Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.091161 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.091412 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbhdm\" (UniqueName: \"kubernetes.io/projected/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-kube-api-access-rbhdm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.091467 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.091612 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.193058 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.193216 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.193265 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.193664 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbhdm\" (UniqueName: \"kubernetes.io/projected/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-kube-api-access-rbhdm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.198869 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.199497 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.199950 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.211424 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbhdm\" (UniqueName: \"kubernetes.io/projected/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-kube-api-access-rbhdm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.292414 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.884056 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg"] Oct 06 08:42:58 crc kubenswrapper[4755]: I1006 08:42:58.892612 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 08:42:59 crc kubenswrapper[4755]: I1006 08:42:59.345718 4755 generic.go:334] "Generic (PLEG): container finished" podID="b378698d-a5e1-4538-93e2-694516a551b1" containerID="7d9377e03af13044f5f644473f4ee0a342aad7e48c362d87f6bb7c5698598c20" exitCode=0 Oct 06 08:42:59 crc kubenswrapper[4755]: I1006 08:42:59.345831 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b378698d-a5e1-4538-93e2-694516a551b1","Type":"ContainerDied","Data":"7d9377e03af13044f5f644473f4ee0a342aad7e48c362d87f6bb7c5698598c20"} Oct 06 08:42:59 crc kubenswrapper[4755]: I1006 08:42:59.348275 4755 generic.go:334] "Generic (PLEG): container finished" podID="5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6" containerID="04eaaec6f0ec12701247b7af45aaf15ddc04c4b56b02577641172549b0711054" exitCode=0 Oct 06 08:42:59 crc kubenswrapper[4755]: I1006 08:42:59.348358 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6","Type":"ContainerDied","Data":"04eaaec6f0ec12701247b7af45aaf15ddc04c4b56b02577641172549b0711054"} Oct 06 08:42:59 crc kubenswrapper[4755]: I1006 08:42:59.351321 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" event={"ID":"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a","Type":"ContainerStarted","Data":"04d741ad8031d95dce720f1dacf9fa6caa9e8b82e275b1363b36592f43fd4903"} Oct 06 08:43:00 crc kubenswrapper[4755]: I1006 08:43:00.362698 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6","Type":"ContainerStarted","Data":"a880fb19d570230228135a87154ef892e439aec1a5c473ded54875688e6e2786"} Oct 06 08:43:00 crc kubenswrapper[4755]: I1006 08:43:00.364067 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 08:43:00 crc kubenswrapper[4755]: I1006 08:43:00.368417 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b378698d-a5e1-4538-93e2-694516a551b1","Type":"ContainerStarted","Data":"77cfffa0ee5994c3946e95fec62d902213115a6d57af96bf8f9e599db9652131"} Oct 06 08:43:00 crc kubenswrapper[4755]: I1006 08:43:00.369257 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:43:00 crc kubenswrapper[4755]: I1006 08:43:00.393309 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.393289862 podStartE2EDuration="36.393289862s" podCreationTimestamp="2025-10-06 08:42:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:43:00.385598021 +0000 UTC m=+1237.214913255" watchObservedRunningTime="2025-10-06 
08:43:00.393289862 +0000 UTC m=+1237.222605076" Oct 06 08:43:00 crc kubenswrapper[4755]: I1006 08:43:00.412626 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.41260646 podStartE2EDuration="37.41260646s" podCreationTimestamp="2025-10-06 08:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:43:00.410682032 +0000 UTC m=+1237.239997266" watchObservedRunningTime="2025-10-06 08:43:00.41260646 +0000 UTC m=+1237.241921674" Oct 06 08:43:04 crc kubenswrapper[4755]: E1006 08:43:04.945305 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d5d33a7_9480_466b_abb7_e8fc7cf08776.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d5d33a7_9480_466b_abb7_e8fc7cf08776.slice/crio-2ce1ee4ac36b9fe0943176a7ad5cfee5d6af263da83b5b649ffb7a606d7fd6d9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf0d28dc_714e_4fb4_ab1d_466d6b6ea905.slice\": RecentStats: unable to find data in memory cache]" Oct 06 08:43:08 crc kubenswrapper[4755]: I1006 08:43:08.449434 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" event={"ID":"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a","Type":"ContainerStarted","Data":"82a81037156df7644363a0830371137a19740acc7e5782bb42d5027d0bb7ed5a"} Oct 06 08:43:08 crc kubenswrapper[4755]: I1006 08:43:08.476707 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" podStartSLOduration=2.9353011799999997 podStartE2EDuration="11.476689647s" podCreationTimestamp="2025-10-06 
08:42:57 +0000 UTC" firstStartedPulling="2025-10-06 08:42:58.892395685 +0000 UTC m=+1235.721710889" lastFinishedPulling="2025-10-06 08:43:07.433784142 +0000 UTC m=+1244.263099356" observedRunningTime="2025-10-06 08:43:08.471191911 +0000 UTC m=+1245.300507335" watchObservedRunningTime="2025-10-06 08:43:08.476689647 +0000 UTC m=+1245.306004861" Oct 06 08:43:14 crc kubenswrapper[4755]: I1006 08:43:14.458768 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:43:14 crc kubenswrapper[4755]: I1006 08:43:14.494784 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 06 08:43:15 crc kubenswrapper[4755]: E1006 08:43:15.192838 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d5d33a7_9480_466b_abb7_e8fc7cf08776.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d5d33a7_9480_466b_abb7_e8fc7cf08776.slice/crio-2ce1ee4ac36b9fe0943176a7ad5cfee5d6af263da83b5b649ffb7a606d7fd6d9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf0d28dc_714e_4fb4_ab1d_466d6b6ea905.slice\": RecentStats: unable to find data in memory cache]" Oct 06 08:43:18 crc kubenswrapper[4755]: I1006 08:43:18.542316 4755 generic.go:334] "Generic (PLEG): container finished" podID="c6344d90-5879-472d-8bbd-bd6f7c6c8d7a" containerID="82a81037156df7644363a0830371137a19740acc7e5782bb42d5027d0bb7ed5a" exitCode=0 Oct 06 08:43:18 crc kubenswrapper[4755]: I1006 08:43:18.542426 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" 
event={"ID":"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a","Type":"ContainerDied","Data":"82a81037156df7644363a0830371137a19740acc7e5782bb42d5027d0bb7ed5a"} Oct 06 08:43:18 crc kubenswrapper[4755]: I1006 08:43:18.912672 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:43:18 crc kubenswrapper[4755]: I1006 08:43:18.912758 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:43:18 crc kubenswrapper[4755]: I1006 08:43:18.912800 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:43:18 crc kubenswrapper[4755]: I1006 08:43:18.913411 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c6f0481014a3fc8cdc1fdc7ef5ec1603dfb57fa2e7007554d45ab50020ac3f64"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:43:18 crc kubenswrapper[4755]: I1006 08:43:18.913469 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://c6f0481014a3fc8cdc1fdc7ef5ec1603dfb57fa2e7007554d45ab50020ac3f64" gracePeriod=600 Oct 06 08:43:19 crc kubenswrapper[4755]: I1006 08:43:19.552717 4755 
generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="c6f0481014a3fc8cdc1fdc7ef5ec1603dfb57fa2e7007554d45ab50020ac3f64" exitCode=0 Oct 06 08:43:19 crc kubenswrapper[4755]: I1006 08:43:19.553664 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"c6f0481014a3fc8cdc1fdc7ef5ec1603dfb57fa2e7007554d45ab50020ac3f64"} Oct 06 08:43:19 crc kubenswrapper[4755]: I1006 08:43:19.553695 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"37b01df043f3f9837ed355e230bc753aeb1a969fd4ba5cafcaadc04ae46cebd9"} Oct 06 08:43:19 crc kubenswrapper[4755]: I1006 08:43:19.553713 4755 scope.go:117] "RemoveContainer" containerID="81b36d63c3c7ca9fbafe357e61481e8979d6babd72103e4b42d972dd0f76d2d5" Oct 06 08:43:19 crc kubenswrapper[4755]: I1006 08:43:19.958854 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:43:19 crc kubenswrapper[4755]: I1006 08:43:19.973232 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-ssh-key\") pod \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " Oct 06 08:43:19 crc kubenswrapper[4755]: I1006 08:43:19.973274 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-repo-setup-combined-ca-bundle\") pod \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " Oct 06 08:43:19 crc kubenswrapper[4755]: I1006 08:43:19.973339 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbhdm\" (UniqueName: \"kubernetes.io/projected/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-kube-api-access-rbhdm\") pod \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " Oct 06 08:43:19 crc kubenswrapper[4755]: I1006 08:43:19.973401 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-inventory\") pod \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\" (UID: \"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a\") " Oct 06 08:43:19 crc kubenswrapper[4755]: I1006 08:43:19.980521 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-kube-api-access-rbhdm" (OuterVolumeSpecName: "kube-api-access-rbhdm") pod "c6344d90-5879-472d-8bbd-bd6f7c6c8d7a" (UID: "c6344d90-5879-472d-8bbd-bd6f7c6c8d7a"). InnerVolumeSpecName "kube-api-access-rbhdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:43:19 crc kubenswrapper[4755]: I1006 08:43:19.980826 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c6344d90-5879-472d-8bbd-bd6f7c6c8d7a" (UID: "c6344d90-5879-472d-8bbd-bd6f7c6c8d7a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.011769 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c6344d90-5879-472d-8bbd-bd6f7c6c8d7a" (UID: "c6344d90-5879-472d-8bbd-bd6f7c6c8d7a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.014528 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-inventory" (OuterVolumeSpecName: "inventory") pod "c6344d90-5879-472d-8bbd-bd6f7c6c8d7a" (UID: "c6344d90-5879-472d-8bbd-bd6f7c6c8d7a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.075115 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbhdm\" (UniqueName: \"kubernetes.io/projected/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-kube-api-access-rbhdm\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.075145 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.075155 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.075164 4755 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.564222 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" event={"ID":"c6344d90-5879-472d-8bbd-bd6f7c6c8d7a","Type":"ContainerDied","Data":"04d741ad8031d95dce720f1dacf9fa6caa9e8b82e275b1363b36592f43fd4903"} Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.564928 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04d741ad8031d95dce720f1dacf9fa6caa9e8b82e275b1363b36592f43fd4903" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.564504 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.654983 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h"] Oct 06 08:43:20 crc kubenswrapper[4755]: E1006 08:43:20.655414 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6344d90-5879-472d-8bbd-bd6f7c6c8d7a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.655438 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6344d90-5879-472d-8bbd-bd6f7c6c8d7a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.655657 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6344d90-5879-472d-8bbd-bd6f7c6c8d7a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.656394 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.658376 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.658614 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.658742 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.669931 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.678746 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h"] Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.682850 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.683285 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.683429 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rd7d\" (UniqueName: \"kubernetes.io/projected/afe346a8-540e-4e57-8f18-a5d0f2b34232-kube-api-access-2rd7d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.683584 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.785411 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.785520 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.785649 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rd7d\" (UniqueName: \"kubernetes.io/projected/afe346a8-540e-4e57-8f18-a5d0f2b34232-kube-api-access-2rd7d\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.785724 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.792171 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.792204 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.792811 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.803916 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rd7d\" (UniqueName: \"kubernetes.io/projected/afe346a8-540e-4e57-8f18-a5d0f2b34232-kube-api-access-2rd7d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:20 crc kubenswrapper[4755]: I1006 08:43:20.979641 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:43:21 crc kubenswrapper[4755]: W1006 08:43:21.487128 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafe346a8_540e_4e57_8f18_a5d0f2b34232.slice/crio-f70f169bea6537aa9363b23bce6565ccf2575cd45ba7559ca9557f0342792fec WatchSource:0}: Error finding container f70f169bea6537aa9363b23bce6565ccf2575cd45ba7559ca9557f0342792fec: Status 404 returned error can't find the container with id f70f169bea6537aa9363b23bce6565ccf2575cd45ba7559ca9557f0342792fec Oct 06 08:43:21 crc kubenswrapper[4755]: I1006 08:43:21.492242 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h"] Oct 06 08:43:21 crc kubenswrapper[4755]: I1006 08:43:21.574583 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" event={"ID":"afe346a8-540e-4e57-8f18-a5d0f2b34232","Type":"ContainerStarted","Data":"f70f169bea6537aa9363b23bce6565ccf2575cd45ba7559ca9557f0342792fec"} Oct 06 08:43:22 crc kubenswrapper[4755]: I1006 08:43:22.583393 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" event={"ID":"afe346a8-540e-4e57-8f18-a5d0f2b34232","Type":"ContainerStarted","Data":"69af5b1fc56f0bdcf35fe08080e54e68f61b67148f803a7889f22b09e620a5f8"} Oct 06 08:44:38 
crc kubenswrapper[4755]: I1006 08:44:38.360809 4755 scope.go:117] "RemoveContainer" containerID="b31eaffd9f888a582bc731bff8351ea0ad7cd7b9a7e939e7eea6609331b77fba" Oct 06 08:44:38 crc kubenswrapper[4755]: I1006 08:44:38.398070 4755 scope.go:117] "RemoveContainer" containerID="6f10384233b1d78bf64b174774e2195e2fcef63c60bb6643ffdd5b7665911c6b" Oct 06 08:44:38 crc kubenswrapper[4755]: I1006 08:44:38.445789 4755 scope.go:117] "RemoveContainer" containerID="6474e483a747d46e7481bb383840eec31fc58ad2ead94f9e53a095811837a5a3" Oct 06 08:44:38 crc kubenswrapper[4755]: I1006 08:44:38.468517 4755 scope.go:117] "RemoveContainer" containerID="b9fc1bd844a85c6943ae59878d822af703cec012b06a7c1fb6da58cc1a0b534e" Oct 06 08:44:38 crc kubenswrapper[4755]: I1006 08:44:38.489793 4755 scope.go:117] "RemoveContainer" containerID="fe5d094a36cfb13b7145e44b7df53b7f187e3f71e3b534f5fa8aabea3a9361b2" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.157333 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" podStartSLOduration=99.580111552 podStartE2EDuration="1m40.157310365s" podCreationTimestamp="2025-10-06 08:43:20 +0000 UTC" firstStartedPulling="2025-10-06 08:43:21.489542669 +0000 UTC m=+1258.318857883" lastFinishedPulling="2025-10-06 08:43:22.066741482 +0000 UTC m=+1258.896056696" observedRunningTime="2025-10-06 08:43:22.598554081 +0000 UTC m=+1259.427869295" watchObservedRunningTime="2025-10-06 08:45:00.157310365 +0000 UTC m=+1356.986625579" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.160479 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv"] Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.161773 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.166641 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.166663 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.176593 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv"] Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.318003 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db7c743b-75a0-4790-81a2-15f9c39d4624-config-volume\") pod \"collect-profiles-29329005-sn7rv\" (UID: \"db7c743b-75a0-4790-81a2-15f9c39d4624\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.318064 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnz7p\" (UniqueName: \"kubernetes.io/projected/db7c743b-75a0-4790-81a2-15f9c39d4624-kube-api-access-vnz7p\") pod \"collect-profiles-29329005-sn7rv\" (UID: \"db7c743b-75a0-4790-81a2-15f9c39d4624\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.318735 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db7c743b-75a0-4790-81a2-15f9c39d4624-secret-volume\") pod \"collect-profiles-29329005-sn7rv\" (UID: \"db7c743b-75a0-4790-81a2-15f9c39d4624\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.421058 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db7c743b-75a0-4790-81a2-15f9c39d4624-secret-volume\") pod \"collect-profiles-29329005-sn7rv\" (UID: \"db7c743b-75a0-4790-81a2-15f9c39d4624\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.421181 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db7c743b-75a0-4790-81a2-15f9c39d4624-config-volume\") pod \"collect-profiles-29329005-sn7rv\" (UID: \"db7c743b-75a0-4790-81a2-15f9c39d4624\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.421240 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnz7p\" (UniqueName: \"kubernetes.io/projected/db7c743b-75a0-4790-81a2-15f9c39d4624-kube-api-access-vnz7p\") pod \"collect-profiles-29329005-sn7rv\" (UID: \"db7c743b-75a0-4790-81a2-15f9c39d4624\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.422255 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db7c743b-75a0-4790-81a2-15f9c39d4624-config-volume\") pod \"collect-profiles-29329005-sn7rv\" (UID: \"db7c743b-75a0-4790-81a2-15f9c39d4624\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.429659 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/db7c743b-75a0-4790-81a2-15f9c39d4624-secret-volume\") pod \"collect-profiles-29329005-sn7rv\" (UID: \"db7c743b-75a0-4790-81a2-15f9c39d4624\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.443888 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnz7p\" (UniqueName: \"kubernetes.io/projected/db7c743b-75a0-4790-81a2-15f9c39d4624-kube-api-access-vnz7p\") pod \"collect-profiles-29329005-sn7rv\" (UID: \"db7c743b-75a0-4790-81a2-15f9c39d4624\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" Oct 06 08:45:00 crc kubenswrapper[4755]: I1006 08:45:00.483749 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" Oct 06 08:45:01 crc kubenswrapper[4755]: I1006 08:45:01.013516 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv"] Oct 06 08:45:01 crc kubenswrapper[4755]: I1006 08:45:01.512077 4755 generic.go:334] "Generic (PLEG): container finished" podID="db7c743b-75a0-4790-81a2-15f9c39d4624" containerID="085f860133848095c4ac4996dbd52a34afd20d15ef2845340ead8f3447701876" exitCode=0 Oct 06 08:45:01 crc kubenswrapper[4755]: I1006 08:45:01.512155 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" event={"ID":"db7c743b-75a0-4790-81a2-15f9c39d4624","Type":"ContainerDied","Data":"085f860133848095c4ac4996dbd52a34afd20d15ef2845340ead8f3447701876"} Oct 06 08:45:01 crc kubenswrapper[4755]: I1006 08:45:01.512448 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" 
event={"ID":"db7c743b-75a0-4790-81a2-15f9c39d4624","Type":"ContainerStarted","Data":"5e589f0e580762be4392f80b8982c9cb7adb0623ac8a560c37d744101de33713"} Oct 06 08:45:02 crc kubenswrapper[4755]: I1006 08:45:02.877155 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" Oct 06 08:45:02 crc kubenswrapper[4755]: I1006 08:45:02.972267 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db7c743b-75a0-4790-81a2-15f9c39d4624-config-volume\") pod \"db7c743b-75a0-4790-81a2-15f9c39d4624\" (UID: \"db7c743b-75a0-4790-81a2-15f9c39d4624\") " Oct 06 08:45:02 crc kubenswrapper[4755]: I1006 08:45:02.972380 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db7c743b-75a0-4790-81a2-15f9c39d4624-secret-volume\") pod \"db7c743b-75a0-4790-81a2-15f9c39d4624\" (UID: \"db7c743b-75a0-4790-81a2-15f9c39d4624\") " Oct 06 08:45:02 crc kubenswrapper[4755]: I1006 08:45:02.972498 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnz7p\" (UniqueName: \"kubernetes.io/projected/db7c743b-75a0-4790-81a2-15f9c39d4624-kube-api-access-vnz7p\") pod \"db7c743b-75a0-4790-81a2-15f9c39d4624\" (UID: \"db7c743b-75a0-4790-81a2-15f9c39d4624\") " Oct 06 08:45:02 crc kubenswrapper[4755]: I1006 08:45:02.973696 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db7c743b-75a0-4790-81a2-15f9c39d4624-config-volume" (OuterVolumeSpecName: "config-volume") pod "db7c743b-75a0-4790-81a2-15f9c39d4624" (UID: "db7c743b-75a0-4790-81a2-15f9c39d4624"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:45:02 crc kubenswrapper[4755]: I1006 08:45:02.980431 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db7c743b-75a0-4790-81a2-15f9c39d4624-kube-api-access-vnz7p" (OuterVolumeSpecName: "kube-api-access-vnz7p") pod "db7c743b-75a0-4790-81a2-15f9c39d4624" (UID: "db7c743b-75a0-4790-81a2-15f9c39d4624"). InnerVolumeSpecName "kube-api-access-vnz7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:45:02 crc kubenswrapper[4755]: I1006 08:45:02.982451 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db7c743b-75a0-4790-81a2-15f9c39d4624-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "db7c743b-75a0-4790-81a2-15f9c39d4624" (UID: "db7c743b-75a0-4790-81a2-15f9c39d4624"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:45:03 crc kubenswrapper[4755]: I1006 08:45:03.075071 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db7c743b-75a0-4790-81a2-15f9c39d4624-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:03 crc kubenswrapper[4755]: I1006 08:45:03.075120 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db7c743b-75a0-4790-81a2-15f9c39d4624-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:03 crc kubenswrapper[4755]: I1006 08:45:03.075129 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnz7p\" (UniqueName: \"kubernetes.io/projected/db7c743b-75a0-4790-81a2-15f9c39d4624-kube-api-access-vnz7p\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:03 crc kubenswrapper[4755]: I1006 08:45:03.539408 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" 
event={"ID":"db7c743b-75a0-4790-81a2-15f9c39d4624","Type":"ContainerDied","Data":"5e589f0e580762be4392f80b8982c9cb7adb0623ac8a560c37d744101de33713"} Oct 06 08:45:03 crc kubenswrapper[4755]: I1006 08:45:03.539472 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e589f0e580762be4392f80b8982c9cb7adb0623ac8a560c37d744101de33713" Oct 06 08:45:03 crc kubenswrapper[4755]: I1006 08:45:03.539553 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv" Oct 06 08:45:48 crc kubenswrapper[4755]: I1006 08:45:48.912553 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:45:48 crc kubenswrapper[4755]: I1006 08:45:48.913048 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:46:16 crc kubenswrapper[4755]: I1006 08:46:16.190043 4755 generic.go:334] "Generic (PLEG): container finished" podID="afe346a8-540e-4e57-8f18-a5d0f2b34232" containerID="69af5b1fc56f0bdcf35fe08080e54e68f61b67148f803a7889f22b09e620a5f8" exitCode=0 Oct 06 08:46:16 crc kubenswrapper[4755]: I1006 08:46:16.190123 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" event={"ID":"afe346a8-540e-4e57-8f18-a5d0f2b34232","Type":"ContainerDied","Data":"69af5b1fc56f0bdcf35fe08080e54e68f61b67148f803a7889f22b09e620a5f8"} Oct 06 08:46:17 crc kubenswrapper[4755]: I1006 08:46:17.607182 4755 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:46:17 crc kubenswrapper[4755]: I1006 08:46:17.761500 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-ssh-key\") pod \"afe346a8-540e-4e57-8f18-a5d0f2b34232\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " Oct 06 08:46:17 crc kubenswrapper[4755]: I1006 08:46:17.762318 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-inventory\") pod \"afe346a8-540e-4e57-8f18-a5d0f2b34232\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " Oct 06 08:46:17 crc kubenswrapper[4755]: I1006 08:46:17.762354 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rd7d\" (UniqueName: \"kubernetes.io/projected/afe346a8-540e-4e57-8f18-a5d0f2b34232-kube-api-access-2rd7d\") pod \"afe346a8-540e-4e57-8f18-a5d0f2b34232\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " Oct 06 08:46:17 crc kubenswrapper[4755]: I1006 08:46:17.762444 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-bootstrap-combined-ca-bundle\") pod \"afe346a8-540e-4e57-8f18-a5d0f2b34232\" (UID: \"afe346a8-540e-4e57-8f18-a5d0f2b34232\") " Oct 06 08:46:17 crc kubenswrapper[4755]: I1006 08:46:17.779344 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe346a8-540e-4e57-8f18-a5d0f2b34232-kube-api-access-2rd7d" (OuterVolumeSpecName: "kube-api-access-2rd7d") pod "afe346a8-540e-4e57-8f18-a5d0f2b34232" (UID: "afe346a8-540e-4e57-8f18-a5d0f2b34232"). InnerVolumeSpecName "kube-api-access-2rd7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:46:17 crc kubenswrapper[4755]: I1006 08:46:17.780351 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "afe346a8-540e-4e57-8f18-a5d0f2b34232" (UID: "afe346a8-540e-4e57-8f18-a5d0f2b34232"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:46:17 crc kubenswrapper[4755]: I1006 08:46:17.793099 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "afe346a8-540e-4e57-8f18-a5d0f2b34232" (UID: "afe346a8-540e-4e57-8f18-a5d0f2b34232"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:46:17 crc kubenswrapper[4755]: I1006 08:46:17.815805 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-inventory" (OuterVolumeSpecName: "inventory") pod "afe346a8-540e-4e57-8f18-a5d0f2b34232" (UID: "afe346a8-540e-4e57-8f18-a5d0f2b34232"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:46:17 crc kubenswrapper[4755]: I1006 08:46:17.864208 4755 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:46:17 crc kubenswrapper[4755]: I1006 08:46:17.864252 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:46:17 crc kubenswrapper[4755]: I1006 08:46:17.864263 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afe346a8-540e-4e57-8f18-a5d0f2b34232-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:46:17 crc kubenswrapper[4755]: I1006 08:46:17.864275 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rd7d\" (UniqueName: \"kubernetes.io/projected/afe346a8-540e-4e57-8f18-a5d0f2b34232-kube-api-access-2rd7d\") on node \"crc\" DevicePath \"\"" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.208863 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" event={"ID":"afe346a8-540e-4e57-8f18-a5d0f2b34232","Type":"ContainerDied","Data":"f70f169bea6537aa9363b23bce6565ccf2575cd45ba7559ca9557f0342792fec"} Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.208928 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f70f169bea6537aa9363b23bce6565ccf2575cd45ba7559ca9557f0342792fec" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.208943 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.316715 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj"] Oct 06 08:46:18 crc kubenswrapper[4755]: E1006 08:46:18.317227 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7c743b-75a0-4790-81a2-15f9c39d4624" containerName="collect-profiles" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.317251 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7c743b-75a0-4790-81a2-15f9c39d4624" containerName="collect-profiles" Oct 06 08:46:18 crc kubenswrapper[4755]: E1006 08:46:18.317285 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe346a8-540e-4e57-8f18-a5d0f2b34232" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.317296 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe346a8-540e-4e57-8f18-a5d0f2b34232" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.317500 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="db7c743b-75a0-4790-81a2-15f9c39d4624" containerName="collect-profiles" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.317543 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe346a8-540e-4e57-8f18-a5d0f2b34232" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.318354 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.321154 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.321167 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.321369 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.321532 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.331084 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj"] Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.474839 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2158b130-0ef0-452f-bb10-2b6738c19e21-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj\" (UID: \"2158b130-0ef0-452f-bb10-2b6738c19e21\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.475181 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgdsp\" (UniqueName: \"kubernetes.io/projected/2158b130-0ef0-452f-bb10-2b6738c19e21-kube-api-access-hgdsp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj\" (UID: \"2158b130-0ef0-452f-bb10-2b6738c19e21\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 
08:46:18.475290 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2158b130-0ef0-452f-bb10-2b6738c19e21-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj\" (UID: \"2158b130-0ef0-452f-bb10-2b6738c19e21\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.577460 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgdsp\" (UniqueName: \"kubernetes.io/projected/2158b130-0ef0-452f-bb10-2b6738c19e21-kube-api-access-hgdsp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj\" (UID: \"2158b130-0ef0-452f-bb10-2b6738c19e21\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.577535 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2158b130-0ef0-452f-bb10-2b6738c19e21-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj\" (UID: \"2158b130-0ef0-452f-bb10-2b6738c19e21\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.577623 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2158b130-0ef0-452f-bb10-2b6738c19e21-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj\" (UID: \"2158b130-0ef0-452f-bb10-2b6738c19e21\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.592114 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2158b130-0ef0-452f-bb10-2b6738c19e21-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj\" (UID: \"2158b130-0ef0-452f-bb10-2b6738c19e21\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.594188 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2158b130-0ef0-452f-bb10-2b6738c19e21-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj\" (UID: \"2158b130-0ef0-452f-bb10-2b6738c19e21\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.599603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgdsp\" (UniqueName: \"kubernetes.io/projected/2158b130-0ef0-452f-bb10-2b6738c19e21-kube-api-access-hgdsp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj\" (UID: \"2158b130-0ef0-452f-bb10-2b6738c19e21\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.642006 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.912670 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:46:18 crc kubenswrapper[4755]: I1006 08:46:18.912974 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:46:19 crc kubenswrapper[4755]: I1006 08:46:19.128649 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj"] Oct 06 08:46:19 crc kubenswrapper[4755]: I1006 08:46:19.222719 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" event={"ID":"2158b130-0ef0-452f-bb10-2b6738c19e21","Type":"ContainerStarted","Data":"50baebd4a1192c431f417d4d4009336540fe8869aa95ccd5c6542ceac2a3912d"} Oct 06 08:46:20 crc kubenswrapper[4755]: I1006 08:46:20.234914 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" event={"ID":"2158b130-0ef0-452f-bb10-2b6738c19e21","Type":"ContainerStarted","Data":"2cd7dd5c1b5ad7d6285459fc9db698c39cf6bef6bfa97b8bd1a177e30f2bf465"} Oct 06 08:46:20 crc kubenswrapper[4755]: I1006 08:46:20.260630 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" podStartSLOduration=1.743339191 
podStartE2EDuration="2.260612151s" podCreationTimestamp="2025-10-06 08:46:18 +0000 UTC" firstStartedPulling="2025-10-06 08:46:19.14198836 +0000 UTC m=+1435.971303574" lastFinishedPulling="2025-10-06 08:46:19.65926132 +0000 UTC m=+1436.488576534" observedRunningTime="2025-10-06 08:46:20.257202527 +0000 UTC m=+1437.086517741" watchObservedRunningTime="2025-10-06 08:46:20.260612151 +0000 UTC m=+1437.089927365" Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.068620 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rhvmf"] Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.072546 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.081251 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhvmf"] Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.173377 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/555a904b-a660-468a-9a4b-c0920b12c391-utilities\") pod \"redhat-marketplace-rhvmf\" (UID: \"555a904b-a660-468a-9a4b-c0920b12c391\") " pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.173426 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txn6f\" (UniqueName: \"kubernetes.io/projected/555a904b-a660-468a-9a4b-c0920b12c391-kube-api-access-txn6f\") pod \"redhat-marketplace-rhvmf\" (UID: \"555a904b-a660-468a-9a4b-c0920b12c391\") " pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.173545 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/555a904b-a660-468a-9a4b-c0920b12c391-catalog-content\") pod \"redhat-marketplace-rhvmf\" (UID: \"555a904b-a660-468a-9a4b-c0920b12c391\") " pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.274351 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/555a904b-a660-468a-9a4b-c0920b12c391-catalog-content\") pod \"redhat-marketplace-rhvmf\" (UID: \"555a904b-a660-468a-9a4b-c0920b12c391\") " pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.275106 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/555a904b-a660-468a-9a4b-c0920b12c391-catalog-content\") pod \"redhat-marketplace-rhvmf\" (UID: \"555a904b-a660-468a-9a4b-c0920b12c391\") " pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.275332 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/555a904b-a660-468a-9a4b-c0920b12c391-utilities\") pod \"redhat-marketplace-rhvmf\" (UID: \"555a904b-a660-468a-9a4b-c0920b12c391\") " pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.275703 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/555a904b-a660-468a-9a4b-c0920b12c391-utilities\") pod \"redhat-marketplace-rhvmf\" (UID: \"555a904b-a660-468a-9a4b-c0920b12c391\") " pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.275368 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txn6f\" (UniqueName: 
\"kubernetes.io/projected/555a904b-a660-468a-9a4b-c0920b12c391-kube-api-access-txn6f\") pod \"redhat-marketplace-rhvmf\" (UID: \"555a904b-a660-468a-9a4b-c0920b12c391\") " pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.301754 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txn6f\" (UniqueName: \"kubernetes.io/projected/555a904b-a660-468a-9a4b-c0920b12c391-kube-api-access-txn6f\") pod \"redhat-marketplace-rhvmf\" (UID: \"555a904b-a660-468a-9a4b-c0920b12c391\") " pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.396692 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:24 crc kubenswrapper[4755]: I1006 08:46:24.887643 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhvmf"] Oct 06 08:46:25 crc kubenswrapper[4755]: I1006 08:46:25.282887 4755 generic.go:334] "Generic (PLEG): container finished" podID="555a904b-a660-468a-9a4b-c0920b12c391" containerID="d3bc21c2f47e617c8893829ce2441902d8157cfaaaa1ecaa51f6f46f5ff8452c" exitCode=0 Oct 06 08:46:25 crc kubenswrapper[4755]: I1006 08:46:25.282934 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhvmf" event={"ID":"555a904b-a660-468a-9a4b-c0920b12c391","Type":"ContainerDied","Data":"d3bc21c2f47e617c8893829ce2441902d8157cfaaaa1ecaa51f6f46f5ff8452c"} Oct 06 08:46:25 crc kubenswrapper[4755]: I1006 08:46:25.282963 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhvmf" event={"ID":"555a904b-a660-468a-9a4b-c0920b12c391","Type":"ContainerStarted","Data":"908e53be892f3643cae9e2f9dafae1b86f97596ffc5c1b5db986844729b7bdb3"} Oct 06 08:46:26 crc kubenswrapper[4755]: I1006 08:46:26.293410 4755 generic.go:334] "Generic (PLEG): container 
finished" podID="555a904b-a660-468a-9a4b-c0920b12c391" containerID="00c338920e836f1c58e92db88f34acf33e43e0c30399923a7f4ddf10f0928ebf" exitCode=0 Oct 06 08:46:26 crc kubenswrapper[4755]: I1006 08:46:26.293506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhvmf" event={"ID":"555a904b-a660-468a-9a4b-c0920b12c391","Type":"ContainerDied","Data":"00c338920e836f1c58e92db88f34acf33e43e0c30399923a7f4ddf10f0928ebf"} Oct 06 08:46:27 crc kubenswrapper[4755]: I1006 08:46:27.308445 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhvmf" event={"ID":"555a904b-a660-468a-9a4b-c0920b12c391","Type":"ContainerStarted","Data":"188faa44674c1480d550f0c6c08e17d174de3749c1fbb25ee2e84a11c2114b87"} Oct 06 08:46:27 crc kubenswrapper[4755]: I1006 08:46:27.329247 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rhvmf" podStartSLOduration=1.7301466319999999 podStartE2EDuration="3.329218455s" podCreationTimestamp="2025-10-06 08:46:24 +0000 UTC" firstStartedPulling="2025-10-06 08:46:25.286344901 +0000 UTC m=+1442.115660115" lastFinishedPulling="2025-10-06 08:46:26.885416724 +0000 UTC m=+1443.714731938" observedRunningTime="2025-10-06 08:46:27.327135608 +0000 UTC m=+1444.156450822" watchObservedRunningTime="2025-10-06 08:46:27.329218455 +0000 UTC m=+1444.158533669" Oct 06 08:46:34 crc kubenswrapper[4755]: I1006 08:46:34.397359 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:34 crc kubenswrapper[4755]: I1006 08:46:34.398066 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:34 crc kubenswrapper[4755]: I1006 08:46:34.442709 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 
06 08:46:35 crc kubenswrapper[4755]: I1006 08:46:35.430176 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:35 crc kubenswrapper[4755]: I1006 08:46:35.474231 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhvmf"] Oct 06 08:46:37 crc kubenswrapper[4755]: I1006 08:46:37.405423 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rhvmf" podUID="555a904b-a660-468a-9a4b-c0920b12c391" containerName="registry-server" containerID="cri-o://188faa44674c1480d550f0c6c08e17d174de3749c1fbb25ee2e84a11c2114b87" gracePeriod=2 Oct 06 08:46:37 crc kubenswrapper[4755]: I1006 08:46:37.884286 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.034338 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/555a904b-a660-468a-9a4b-c0920b12c391-catalog-content\") pod \"555a904b-a660-468a-9a4b-c0920b12c391\" (UID: \"555a904b-a660-468a-9a4b-c0920b12c391\") " Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.034535 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/555a904b-a660-468a-9a4b-c0920b12c391-utilities\") pod \"555a904b-a660-468a-9a4b-c0920b12c391\" (UID: \"555a904b-a660-468a-9a4b-c0920b12c391\") " Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.034633 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txn6f\" (UniqueName: \"kubernetes.io/projected/555a904b-a660-468a-9a4b-c0920b12c391-kube-api-access-txn6f\") pod \"555a904b-a660-468a-9a4b-c0920b12c391\" (UID: \"555a904b-a660-468a-9a4b-c0920b12c391\") " Oct 
06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.035980 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/555a904b-a660-468a-9a4b-c0920b12c391-utilities" (OuterVolumeSpecName: "utilities") pod "555a904b-a660-468a-9a4b-c0920b12c391" (UID: "555a904b-a660-468a-9a4b-c0920b12c391"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.041858 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555a904b-a660-468a-9a4b-c0920b12c391-kube-api-access-txn6f" (OuterVolumeSpecName: "kube-api-access-txn6f") pod "555a904b-a660-468a-9a4b-c0920b12c391" (UID: "555a904b-a660-468a-9a4b-c0920b12c391"). InnerVolumeSpecName "kube-api-access-txn6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.054838 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/555a904b-a660-468a-9a4b-c0920b12c391-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "555a904b-a660-468a-9a4b-c0920b12c391" (UID: "555a904b-a660-468a-9a4b-c0920b12c391"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.137460 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/555a904b-a660-468a-9a4b-c0920b12c391-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.137508 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/555a904b-a660-468a-9a4b-c0920b12c391-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.137541 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txn6f\" (UniqueName: \"kubernetes.io/projected/555a904b-a660-468a-9a4b-c0920b12c391-kube-api-access-txn6f\") on node \"crc\" DevicePath \"\"" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.418183 4755 generic.go:334] "Generic (PLEG): container finished" podID="555a904b-a660-468a-9a4b-c0920b12c391" containerID="188faa44674c1480d550f0c6c08e17d174de3749c1fbb25ee2e84a11c2114b87" exitCode=0 Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.418252 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhvmf" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.418290 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhvmf" event={"ID":"555a904b-a660-468a-9a4b-c0920b12c391","Type":"ContainerDied","Data":"188faa44674c1480d550f0c6c08e17d174de3749c1fbb25ee2e84a11c2114b87"} Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.419963 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhvmf" event={"ID":"555a904b-a660-468a-9a4b-c0920b12c391","Type":"ContainerDied","Data":"908e53be892f3643cae9e2f9dafae1b86f97596ffc5c1b5db986844729b7bdb3"} Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.420000 4755 scope.go:117] "RemoveContainer" containerID="188faa44674c1480d550f0c6c08e17d174de3749c1fbb25ee2e84a11c2114b87" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.444609 4755 scope.go:117] "RemoveContainer" containerID="00c338920e836f1c58e92db88f34acf33e43e0c30399923a7f4ddf10f0928ebf" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.480988 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhvmf"] Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.484665 4755 scope.go:117] "RemoveContainer" containerID="d3bc21c2f47e617c8893829ce2441902d8157cfaaaa1ecaa51f6f46f5ff8452c" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.493923 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhvmf"] Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.522800 4755 scope.go:117] "RemoveContainer" containerID="188faa44674c1480d550f0c6c08e17d174de3749c1fbb25ee2e84a11c2114b87" Oct 06 08:46:38 crc kubenswrapper[4755]: E1006 08:46:38.528682 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"188faa44674c1480d550f0c6c08e17d174de3749c1fbb25ee2e84a11c2114b87\": container with ID starting with 188faa44674c1480d550f0c6c08e17d174de3749c1fbb25ee2e84a11c2114b87 not found: ID does not exist" containerID="188faa44674c1480d550f0c6c08e17d174de3749c1fbb25ee2e84a11c2114b87" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.528725 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"188faa44674c1480d550f0c6c08e17d174de3749c1fbb25ee2e84a11c2114b87"} err="failed to get container status \"188faa44674c1480d550f0c6c08e17d174de3749c1fbb25ee2e84a11c2114b87\": rpc error: code = NotFound desc = could not find container \"188faa44674c1480d550f0c6c08e17d174de3749c1fbb25ee2e84a11c2114b87\": container with ID starting with 188faa44674c1480d550f0c6c08e17d174de3749c1fbb25ee2e84a11c2114b87 not found: ID does not exist" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.528750 4755 scope.go:117] "RemoveContainer" containerID="00c338920e836f1c58e92db88f34acf33e43e0c30399923a7f4ddf10f0928ebf" Oct 06 08:46:38 crc kubenswrapper[4755]: E1006 08:46:38.529239 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00c338920e836f1c58e92db88f34acf33e43e0c30399923a7f4ddf10f0928ebf\": container with ID starting with 00c338920e836f1c58e92db88f34acf33e43e0c30399923a7f4ddf10f0928ebf not found: ID does not exist" containerID="00c338920e836f1c58e92db88f34acf33e43e0c30399923a7f4ddf10f0928ebf" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.529262 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00c338920e836f1c58e92db88f34acf33e43e0c30399923a7f4ddf10f0928ebf"} err="failed to get container status \"00c338920e836f1c58e92db88f34acf33e43e0c30399923a7f4ddf10f0928ebf\": rpc error: code = NotFound desc = could not find container \"00c338920e836f1c58e92db88f34acf33e43e0c30399923a7f4ddf10f0928ebf\": container with ID 
starting with 00c338920e836f1c58e92db88f34acf33e43e0c30399923a7f4ddf10f0928ebf not found: ID does not exist" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.529275 4755 scope.go:117] "RemoveContainer" containerID="d3bc21c2f47e617c8893829ce2441902d8157cfaaaa1ecaa51f6f46f5ff8452c" Oct 06 08:46:38 crc kubenswrapper[4755]: E1006 08:46:38.529528 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3bc21c2f47e617c8893829ce2441902d8157cfaaaa1ecaa51f6f46f5ff8452c\": container with ID starting with d3bc21c2f47e617c8893829ce2441902d8157cfaaaa1ecaa51f6f46f5ff8452c not found: ID does not exist" containerID="d3bc21c2f47e617c8893829ce2441902d8157cfaaaa1ecaa51f6f46f5ff8452c" Oct 06 08:46:38 crc kubenswrapper[4755]: I1006 08:46:38.529548 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3bc21c2f47e617c8893829ce2441902d8157cfaaaa1ecaa51f6f46f5ff8452c"} err="failed to get container status \"d3bc21c2f47e617c8893829ce2441902d8157cfaaaa1ecaa51f6f46f5ff8452c\": rpc error: code = NotFound desc = could not find container \"d3bc21c2f47e617c8893829ce2441902d8157cfaaaa1ecaa51f6f46f5ff8452c\": container with ID starting with d3bc21c2f47e617c8893829ce2441902d8157cfaaaa1ecaa51f6f46f5ff8452c not found: ID does not exist" Oct 06 08:46:39 crc kubenswrapper[4755]: I1006 08:46:39.905999 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="555a904b-a660-468a-9a4b-c0920b12c391" path="/var/lib/kubelet/pods/555a904b-a660-468a-9a4b-c0920b12c391/volumes" Oct 06 08:46:48 crc kubenswrapper[4755]: I1006 08:46:48.912401 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:46:48 crc kubenswrapper[4755]: I1006 
08:46:48.913014 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:46:48 crc kubenswrapper[4755]: I1006 08:46:48.913066 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:46:48 crc kubenswrapper[4755]: I1006 08:46:48.927054 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37b01df043f3f9837ed355e230bc753aeb1a969fd4ba5cafcaadc04ae46cebd9"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:46:48 crc kubenswrapper[4755]: I1006 08:46:48.927156 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://37b01df043f3f9837ed355e230bc753aeb1a969fd4ba5cafcaadc04ae46cebd9" gracePeriod=600 Oct 06 08:46:49 crc kubenswrapper[4755]: I1006 08:46:49.521630 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="37b01df043f3f9837ed355e230bc753aeb1a969fd4ba5cafcaadc04ae46cebd9" exitCode=0 Oct 06 08:46:49 crc kubenswrapper[4755]: I1006 08:46:49.521712 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"37b01df043f3f9837ed355e230bc753aeb1a969fd4ba5cafcaadc04ae46cebd9"} Oct 06 08:46:49 crc 
kubenswrapper[4755]: I1006 08:46:49.522098 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73"} Oct 06 08:46:49 crc kubenswrapper[4755]: I1006 08:46:49.522121 4755 scope.go:117] "RemoveContainer" containerID="c6f0481014a3fc8cdc1fdc7ef5ec1603dfb57fa2e7007554d45ab50020ac3f64" Oct 06 08:47:06 crc kubenswrapper[4755]: I1006 08:47:06.897357 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qh4k8"] Oct 06 08:47:06 crc kubenswrapper[4755]: E1006 08:47:06.898423 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555a904b-a660-468a-9a4b-c0920b12c391" containerName="registry-server" Oct 06 08:47:06 crc kubenswrapper[4755]: I1006 08:47:06.898435 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="555a904b-a660-468a-9a4b-c0920b12c391" containerName="registry-server" Oct 06 08:47:06 crc kubenswrapper[4755]: E1006 08:47:06.898458 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555a904b-a660-468a-9a4b-c0920b12c391" containerName="extract-utilities" Oct 06 08:47:06 crc kubenswrapper[4755]: I1006 08:47:06.898465 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="555a904b-a660-468a-9a4b-c0920b12c391" containerName="extract-utilities" Oct 06 08:47:06 crc kubenswrapper[4755]: E1006 08:47:06.898484 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555a904b-a660-468a-9a4b-c0920b12c391" containerName="extract-content" Oct 06 08:47:06 crc kubenswrapper[4755]: I1006 08:47:06.898491 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="555a904b-a660-468a-9a4b-c0920b12c391" containerName="extract-content" Oct 06 08:47:06 crc kubenswrapper[4755]: I1006 08:47:06.898727 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="555a904b-a660-468a-9a4b-c0920b12c391" containerName="registry-server" Oct 06 08:47:06 crc kubenswrapper[4755]: I1006 08:47:06.900040 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:06 crc kubenswrapper[4755]: I1006 08:47:06.905625 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qh4k8"] Oct 06 08:47:07 crc kubenswrapper[4755]: I1006 08:47:07.072240 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bcsw\" (UniqueName: \"kubernetes.io/projected/02df6f12-6240-460e-8b5f-77fc3e4b4bda-kube-api-access-6bcsw\") pod \"certified-operators-qh4k8\" (UID: \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\") " pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:07 crc kubenswrapper[4755]: I1006 08:47:07.072334 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02df6f12-6240-460e-8b5f-77fc3e4b4bda-utilities\") pod \"certified-operators-qh4k8\" (UID: \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\") " pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:07 crc kubenswrapper[4755]: I1006 08:47:07.072370 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02df6f12-6240-460e-8b5f-77fc3e4b4bda-catalog-content\") pod \"certified-operators-qh4k8\" (UID: \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\") " pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:07 crc kubenswrapper[4755]: I1006 08:47:07.173692 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bcsw\" (UniqueName: \"kubernetes.io/projected/02df6f12-6240-460e-8b5f-77fc3e4b4bda-kube-api-access-6bcsw\") pod \"certified-operators-qh4k8\" 
(UID: \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\") " pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:07 crc kubenswrapper[4755]: I1006 08:47:07.173761 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02df6f12-6240-460e-8b5f-77fc3e4b4bda-utilities\") pod \"certified-operators-qh4k8\" (UID: \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\") " pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:07 crc kubenswrapper[4755]: I1006 08:47:07.173787 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02df6f12-6240-460e-8b5f-77fc3e4b4bda-catalog-content\") pod \"certified-operators-qh4k8\" (UID: \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\") " pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:07 crc kubenswrapper[4755]: I1006 08:47:07.174300 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02df6f12-6240-460e-8b5f-77fc3e4b4bda-utilities\") pod \"certified-operators-qh4k8\" (UID: \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\") " pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:07 crc kubenswrapper[4755]: I1006 08:47:07.174324 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02df6f12-6240-460e-8b5f-77fc3e4b4bda-catalog-content\") pod \"certified-operators-qh4k8\" (UID: \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\") " pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:07 crc kubenswrapper[4755]: I1006 08:47:07.193880 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bcsw\" (UniqueName: \"kubernetes.io/projected/02df6f12-6240-460e-8b5f-77fc3e4b4bda-kube-api-access-6bcsw\") pod \"certified-operators-qh4k8\" (UID: \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\") " 
pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:07 crc kubenswrapper[4755]: I1006 08:47:07.286445 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:07 crc kubenswrapper[4755]: I1006 08:47:07.801185 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qh4k8"] Oct 06 08:47:08 crc kubenswrapper[4755]: I1006 08:47:08.676276 4755 generic.go:334] "Generic (PLEG): container finished" podID="02df6f12-6240-460e-8b5f-77fc3e4b4bda" containerID="b0161806d99acb5b5e5d901cddd30be2d723744c48df9d5ae623b6468f53ed45" exitCode=0 Oct 06 08:47:08 crc kubenswrapper[4755]: I1006 08:47:08.676337 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qh4k8" event={"ID":"02df6f12-6240-460e-8b5f-77fc3e4b4bda","Type":"ContainerDied","Data":"b0161806d99acb5b5e5d901cddd30be2d723744c48df9d5ae623b6468f53ed45"} Oct 06 08:47:08 crc kubenswrapper[4755]: I1006 08:47:08.676556 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qh4k8" event={"ID":"02df6f12-6240-460e-8b5f-77fc3e4b4bda","Type":"ContainerStarted","Data":"8e21b378fa3776c0ad7bf72a419755050e3ee992ceccadfd15581fd0381b66ef"} Oct 06 08:47:09 crc kubenswrapper[4755]: I1006 08:47:09.686623 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qh4k8" event={"ID":"02df6f12-6240-460e-8b5f-77fc3e4b4bda","Type":"ContainerStarted","Data":"890c3f0d73f251cdbda821e2c682d5e2c0ba9cdd3016c624ad1a013860863c0a"} Oct 06 08:47:10 crc kubenswrapper[4755]: I1006 08:47:10.696346 4755 generic.go:334] "Generic (PLEG): container finished" podID="02df6f12-6240-460e-8b5f-77fc3e4b4bda" containerID="890c3f0d73f251cdbda821e2c682d5e2c0ba9cdd3016c624ad1a013860863c0a" exitCode=0 Oct 06 08:47:10 crc kubenswrapper[4755]: I1006 08:47:10.696397 4755 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-qh4k8" event={"ID":"02df6f12-6240-460e-8b5f-77fc3e4b4bda","Type":"ContainerDied","Data":"890c3f0d73f251cdbda821e2c682d5e2c0ba9cdd3016c624ad1a013860863c0a"} Oct 06 08:47:11 crc kubenswrapper[4755]: I1006 08:47:11.708983 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qh4k8" event={"ID":"02df6f12-6240-460e-8b5f-77fc3e4b4bda","Type":"ContainerStarted","Data":"4f543c0951577f9d54ef92c3745b8a7512cb99b4c59106302e32e135ed12ca9f"} Oct 06 08:47:11 crc kubenswrapper[4755]: I1006 08:47:11.736589 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qh4k8" podStartSLOduration=3.265053683 podStartE2EDuration="5.736554098s" podCreationTimestamp="2025-10-06 08:47:06 +0000 UTC" firstStartedPulling="2025-10-06 08:47:08.678318918 +0000 UTC m=+1485.507634122" lastFinishedPulling="2025-10-06 08:47:11.149819323 +0000 UTC m=+1487.979134537" observedRunningTime="2025-10-06 08:47:11.726133789 +0000 UTC m=+1488.555449013" watchObservedRunningTime="2025-10-06 08:47:11.736554098 +0000 UTC m=+1488.565869312" Oct 06 08:47:17 crc kubenswrapper[4755]: I1006 08:47:17.287262 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:17 crc kubenswrapper[4755]: I1006 08:47:17.287878 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:17 crc kubenswrapper[4755]: I1006 08:47:17.334409 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:17 crc kubenswrapper[4755]: I1006 08:47:17.802894 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:17 crc kubenswrapper[4755]: I1006 
08:47:17.849005 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qh4k8"] Oct 06 08:47:19 crc kubenswrapper[4755]: I1006 08:47:19.782324 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qh4k8" podUID="02df6f12-6240-460e-8b5f-77fc3e4b4bda" containerName="registry-server" containerID="cri-o://4f543c0951577f9d54ef92c3745b8a7512cb99b4c59106302e32e135ed12ca9f" gracePeriod=2 Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.193000 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.335140 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02df6f12-6240-460e-8b5f-77fc3e4b4bda-catalog-content\") pod \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\" (UID: \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\") " Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.335191 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02df6f12-6240-460e-8b5f-77fc3e4b4bda-utilities\") pod \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\" (UID: \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\") " Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.335247 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bcsw\" (UniqueName: \"kubernetes.io/projected/02df6f12-6240-460e-8b5f-77fc3e4b4bda-kube-api-access-6bcsw\") pod \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\" (UID: \"02df6f12-6240-460e-8b5f-77fc3e4b4bda\") " Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.336063 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02df6f12-6240-460e-8b5f-77fc3e4b4bda-utilities" (OuterVolumeSpecName: 
"utilities") pod "02df6f12-6240-460e-8b5f-77fc3e4b4bda" (UID: "02df6f12-6240-460e-8b5f-77fc3e4b4bda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.341733 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02df6f12-6240-460e-8b5f-77fc3e4b4bda-kube-api-access-6bcsw" (OuterVolumeSpecName: "kube-api-access-6bcsw") pod "02df6f12-6240-460e-8b5f-77fc3e4b4bda" (UID: "02df6f12-6240-460e-8b5f-77fc3e4b4bda"). InnerVolumeSpecName "kube-api-access-6bcsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.378069 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02df6f12-6240-460e-8b5f-77fc3e4b4bda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02df6f12-6240-460e-8b5f-77fc3e4b4bda" (UID: "02df6f12-6240-460e-8b5f-77fc3e4b4bda"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.437376 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02df6f12-6240-460e-8b5f-77fc3e4b4bda-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.437403 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02df6f12-6240-460e-8b5f-77fc3e4b4bda-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.437412 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bcsw\" (UniqueName: \"kubernetes.io/projected/02df6f12-6240-460e-8b5f-77fc3e4b4bda-kube-api-access-6bcsw\") on node \"crc\" DevicePath \"\"" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.795482 4755 generic.go:334] "Generic (PLEG): container finished" podID="02df6f12-6240-460e-8b5f-77fc3e4b4bda" containerID="4f543c0951577f9d54ef92c3745b8a7512cb99b4c59106302e32e135ed12ca9f" exitCode=0 Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.795630 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qh4k8" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.795675 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qh4k8" event={"ID":"02df6f12-6240-460e-8b5f-77fc3e4b4bda","Type":"ContainerDied","Data":"4f543c0951577f9d54ef92c3745b8a7512cb99b4c59106302e32e135ed12ca9f"} Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.796038 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qh4k8" event={"ID":"02df6f12-6240-460e-8b5f-77fc3e4b4bda","Type":"ContainerDied","Data":"8e21b378fa3776c0ad7bf72a419755050e3ee992ceccadfd15581fd0381b66ef"} Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.796059 4755 scope.go:117] "RemoveContainer" containerID="4f543c0951577f9d54ef92c3745b8a7512cb99b4c59106302e32e135ed12ca9f" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.817478 4755 scope.go:117] "RemoveContainer" containerID="890c3f0d73f251cdbda821e2c682d5e2c0ba9cdd3016c624ad1a013860863c0a" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.839957 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qh4k8"] Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.850315 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qh4k8"] Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.861709 4755 scope.go:117] "RemoveContainer" containerID="b0161806d99acb5b5e5d901cddd30be2d723744c48df9d5ae623b6468f53ed45" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.890382 4755 scope.go:117] "RemoveContainer" containerID="4f543c0951577f9d54ef92c3745b8a7512cb99b4c59106302e32e135ed12ca9f" Oct 06 08:47:20 crc kubenswrapper[4755]: E1006 08:47:20.890806 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4f543c0951577f9d54ef92c3745b8a7512cb99b4c59106302e32e135ed12ca9f\": container with ID starting with 4f543c0951577f9d54ef92c3745b8a7512cb99b4c59106302e32e135ed12ca9f not found: ID does not exist" containerID="4f543c0951577f9d54ef92c3745b8a7512cb99b4c59106302e32e135ed12ca9f" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.890835 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f543c0951577f9d54ef92c3745b8a7512cb99b4c59106302e32e135ed12ca9f"} err="failed to get container status \"4f543c0951577f9d54ef92c3745b8a7512cb99b4c59106302e32e135ed12ca9f\": rpc error: code = NotFound desc = could not find container \"4f543c0951577f9d54ef92c3745b8a7512cb99b4c59106302e32e135ed12ca9f\": container with ID starting with 4f543c0951577f9d54ef92c3745b8a7512cb99b4c59106302e32e135ed12ca9f not found: ID does not exist" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.890861 4755 scope.go:117] "RemoveContainer" containerID="890c3f0d73f251cdbda821e2c682d5e2c0ba9cdd3016c624ad1a013860863c0a" Oct 06 08:47:20 crc kubenswrapper[4755]: E1006 08:47:20.891302 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"890c3f0d73f251cdbda821e2c682d5e2c0ba9cdd3016c624ad1a013860863c0a\": container with ID starting with 890c3f0d73f251cdbda821e2c682d5e2c0ba9cdd3016c624ad1a013860863c0a not found: ID does not exist" containerID="890c3f0d73f251cdbda821e2c682d5e2c0ba9cdd3016c624ad1a013860863c0a" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.891364 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"890c3f0d73f251cdbda821e2c682d5e2c0ba9cdd3016c624ad1a013860863c0a"} err="failed to get container status \"890c3f0d73f251cdbda821e2c682d5e2c0ba9cdd3016c624ad1a013860863c0a\": rpc error: code = NotFound desc = could not find container \"890c3f0d73f251cdbda821e2c682d5e2c0ba9cdd3016c624ad1a013860863c0a\": container with ID 
starting with 890c3f0d73f251cdbda821e2c682d5e2c0ba9cdd3016c624ad1a013860863c0a not found: ID does not exist" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.891391 4755 scope.go:117] "RemoveContainer" containerID="b0161806d99acb5b5e5d901cddd30be2d723744c48df9d5ae623b6468f53ed45" Oct 06 08:47:20 crc kubenswrapper[4755]: E1006 08:47:20.892054 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0161806d99acb5b5e5d901cddd30be2d723744c48df9d5ae623b6468f53ed45\": container with ID starting with b0161806d99acb5b5e5d901cddd30be2d723744c48df9d5ae623b6468f53ed45 not found: ID does not exist" containerID="b0161806d99acb5b5e5d901cddd30be2d723744c48df9d5ae623b6468f53ed45" Oct 06 08:47:20 crc kubenswrapper[4755]: I1006 08:47:20.892089 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0161806d99acb5b5e5d901cddd30be2d723744c48df9d5ae623b6468f53ed45"} err="failed to get container status \"b0161806d99acb5b5e5d901cddd30be2d723744c48df9d5ae623b6468f53ed45\": rpc error: code = NotFound desc = could not find container \"b0161806d99acb5b5e5d901cddd30be2d723744c48df9d5ae623b6468f53ed45\": container with ID starting with b0161806d99acb5b5e5d901cddd30be2d723744c48df9d5ae623b6468f53ed45 not found: ID does not exist" Oct 06 08:47:21 crc kubenswrapper[4755]: I1006 08:47:21.895592 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02df6f12-6240-460e-8b5f-77fc3e4b4bda" path="/var/lib/kubelet/pods/02df6f12-6240-460e-8b5f-77fc3e4b4bda/volumes" Oct 06 08:47:26 crc kubenswrapper[4755]: I1006 08:47:26.848327 4755 generic.go:334] "Generic (PLEG): container finished" podID="2158b130-0ef0-452f-bb10-2b6738c19e21" containerID="2cd7dd5c1b5ad7d6285459fc9db698c39cf6bef6bfa97b8bd1a177e30f2bf465" exitCode=0 Oct 06 08:47:26 crc kubenswrapper[4755]: I1006 08:47:26.848392 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" event={"ID":"2158b130-0ef0-452f-bb10-2b6738c19e21","Type":"ContainerDied","Data":"2cd7dd5c1b5ad7d6285459fc9db698c39cf6bef6bfa97b8bd1a177e30f2bf465"} Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.230667 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.381881 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgdsp\" (UniqueName: \"kubernetes.io/projected/2158b130-0ef0-452f-bb10-2b6738c19e21-kube-api-access-hgdsp\") pod \"2158b130-0ef0-452f-bb10-2b6738c19e21\" (UID: \"2158b130-0ef0-452f-bb10-2b6738c19e21\") " Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.382854 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2158b130-0ef0-452f-bb10-2b6738c19e21-ssh-key\") pod \"2158b130-0ef0-452f-bb10-2b6738c19e21\" (UID: \"2158b130-0ef0-452f-bb10-2b6738c19e21\") " Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.382892 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2158b130-0ef0-452f-bb10-2b6738c19e21-inventory\") pod \"2158b130-0ef0-452f-bb10-2b6738c19e21\" (UID: \"2158b130-0ef0-452f-bb10-2b6738c19e21\") " Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.389103 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2158b130-0ef0-452f-bb10-2b6738c19e21-kube-api-access-hgdsp" (OuterVolumeSpecName: "kube-api-access-hgdsp") pod "2158b130-0ef0-452f-bb10-2b6738c19e21" (UID: "2158b130-0ef0-452f-bb10-2b6738c19e21"). InnerVolumeSpecName "kube-api-access-hgdsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.408065 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2158b130-0ef0-452f-bb10-2b6738c19e21-inventory" (OuterVolumeSpecName: "inventory") pod "2158b130-0ef0-452f-bb10-2b6738c19e21" (UID: "2158b130-0ef0-452f-bb10-2b6738c19e21"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.409527 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2158b130-0ef0-452f-bb10-2b6738c19e21-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2158b130-0ef0-452f-bb10-2b6738c19e21" (UID: "2158b130-0ef0-452f-bb10-2b6738c19e21"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.484621 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgdsp\" (UniqueName: \"kubernetes.io/projected/2158b130-0ef0-452f-bb10-2b6738c19e21-kube-api-access-hgdsp\") on node \"crc\" DevicePath \"\"" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.484675 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2158b130-0ef0-452f-bb10-2b6738c19e21-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.484690 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2158b130-0ef0-452f-bb10-2b6738c19e21-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.866480 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" 
event={"ID":"2158b130-0ef0-452f-bb10-2b6738c19e21","Type":"ContainerDied","Data":"50baebd4a1192c431f417d4d4009336540fe8869aa95ccd5c6542ceac2a3912d"} Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.866528 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.866533 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50baebd4a1192c431f417d4d4009336540fe8869aa95ccd5c6542ceac2a3912d" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.939834 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk"] Oct 06 08:47:28 crc kubenswrapper[4755]: E1006 08:47:28.940367 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02df6f12-6240-460e-8b5f-77fc3e4b4bda" containerName="extract-utilities" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.940392 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="02df6f12-6240-460e-8b5f-77fc3e4b4bda" containerName="extract-utilities" Oct 06 08:47:28 crc kubenswrapper[4755]: E1006 08:47:28.940409 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02df6f12-6240-460e-8b5f-77fc3e4b4bda" containerName="registry-server" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.940417 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="02df6f12-6240-460e-8b5f-77fc3e4b4bda" containerName="registry-server" Oct 06 08:47:28 crc kubenswrapper[4755]: E1006 08:47:28.940438 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2158b130-0ef0-452f-bb10-2b6738c19e21" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.940449 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2158b130-0ef0-452f-bb10-2b6738c19e21" 
containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 08:47:28 crc kubenswrapper[4755]: E1006 08:47:28.940471 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02df6f12-6240-460e-8b5f-77fc3e4b4bda" containerName="extract-content" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.940480 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="02df6f12-6240-460e-8b5f-77fc3e4b4bda" containerName="extract-content" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.940669 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2158b130-0ef0-452f-bb10-2b6738c19e21" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.940693 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="02df6f12-6240-460e-8b5f-77fc3e4b4bda" containerName="registry-server" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.941442 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.944993 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.945188 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.946144 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.946328 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.953118 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk"] Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.993819 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88915740-2d1e-4127-9c29-497f8d485408-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk\" (UID: \"88915740-2d1e-4127-9c29-497f8d485408\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 08:47:28.993875 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlt5w\" (UniqueName: \"kubernetes.io/projected/88915740-2d1e-4127-9c29-497f8d485408-kube-api-access-nlt5w\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk\" (UID: \"88915740-2d1e-4127-9c29-497f8d485408\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" Oct 06 08:47:28 crc kubenswrapper[4755]: I1006 
08:47:28.994025 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88915740-2d1e-4127-9c29-497f8d485408-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk\" (UID: \"88915740-2d1e-4127-9c29-497f8d485408\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" Oct 06 08:47:29 crc kubenswrapper[4755]: I1006 08:47:29.094850 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88915740-2d1e-4127-9c29-497f8d485408-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk\" (UID: \"88915740-2d1e-4127-9c29-497f8d485408\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" Oct 06 08:47:29 crc kubenswrapper[4755]: I1006 08:47:29.094998 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88915740-2d1e-4127-9c29-497f8d485408-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk\" (UID: \"88915740-2d1e-4127-9c29-497f8d485408\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" Oct 06 08:47:29 crc kubenswrapper[4755]: I1006 08:47:29.095033 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlt5w\" (UniqueName: \"kubernetes.io/projected/88915740-2d1e-4127-9c29-497f8d485408-kube-api-access-nlt5w\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk\" (UID: \"88915740-2d1e-4127-9c29-497f8d485408\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" Oct 06 08:47:29 crc kubenswrapper[4755]: I1006 08:47:29.098803 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88915740-2d1e-4127-9c29-497f8d485408-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk\" (UID: \"88915740-2d1e-4127-9c29-497f8d485408\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" Oct 06 08:47:29 crc kubenswrapper[4755]: I1006 08:47:29.101288 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88915740-2d1e-4127-9c29-497f8d485408-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk\" (UID: \"88915740-2d1e-4127-9c29-497f8d485408\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" Oct 06 08:47:29 crc kubenswrapper[4755]: I1006 08:47:29.115079 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlt5w\" (UniqueName: \"kubernetes.io/projected/88915740-2d1e-4127-9c29-497f8d485408-kube-api-access-nlt5w\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk\" (UID: \"88915740-2d1e-4127-9c29-497f8d485408\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" Oct 06 08:47:29 crc kubenswrapper[4755]: I1006 08:47:29.267511 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" Oct 06 08:47:29 crc kubenswrapper[4755]: I1006 08:47:29.791589 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk"] Oct 06 08:47:29 crc kubenswrapper[4755]: I1006 08:47:29.875091 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" event={"ID":"88915740-2d1e-4127-9c29-497f8d485408","Type":"ContainerStarted","Data":"247ddd964099edec594fe248254281bcbb46e6528e3343212b23d019197b07df"} Oct 06 08:47:30 crc kubenswrapper[4755]: I1006 08:47:30.887012 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" event={"ID":"88915740-2d1e-4127-9c29-497f8d485408","Type":"ContainerStarted","Data":"b5f970dde320f1fb2f3b25098fd82abd0290b06722856c40444150c1aa7eff82"} Oct 06 08:47:30 crc kubenswrapper[4755]: I1006 08:47:30.917514 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" podStartSLOduration=2.5103439119999997 podStartE2EDuration="2.917485417s" podCreationTimestamp="2025-10-06 08:47:28 +0000 UTC" firstStartedPulling="2025-10-06 08:47:29.795416477 +0000 UTC m=+1506.624731681" lastFinishedPulling="2025-10-06 08:47:30.202557972 +0000 UTC m=+1507.031873186" observedRunningTime="2025-10-06 08:47:30.903074398 +0000 UTC m=+1507.732389652" watchObservedRunningTime="2025-10-06 08:47:30.917485417 +0000 UTC m=+1507.746800661" Oct 06 08:47:34 crc kubenswrapper[4755]: I1006 08:47:34.919690 4755 generic.go:334] "Generic (PLEG): container finished" podID="88915740-2d1e-4127-9c29-497f8d485408" containerID="b5f970dde320f1fb2f3b25098fd82abd0290b06722856c40444150c1aa7eff82" exitCode=0 Oct 06 08:47:34 crc kubenswrapper[4755]: I1006 08:47:34.920297 4755 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" event={"ID":"88915740-2d1e-4127-9c29-497f8d485408","Type":"ContainerDied","Data":"b5f970dde320f1fb2f3b25098fd82abd0290b06722856c40444150c1aa7eff82"} Oct 06 08:47:36 crc kubenswrapper[4755]: I1006 08:47:36.283497 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" Oct 06 08:47:36 crc kubenswrapper[4755]: I1006 08:47:36.426466 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88915740-2d1e-4127-9c29-497f8d485408-ssh-key\") pod \"88915740-2d1e-4127-9c29-497f8d485408\" (UID: \"88915740-2d1e-4127-9c29-497f8d485408\") " Oct 06 08:47:36 crc kubenswrapper[4755]: I1006 08:47:36.426793 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlt5w\" (UniqueName: \"kubernetes.io/projected/88915740-2d1e-4127-9c29-497f8d485408-kube-api-access-nlt5w\") pod \"88915740-2d1e-4127-9c29-497f8d485408\" (UID: \"88915740-2d1e-4127-9c29-497f8d485408\") " Oct 06 08:47:36 crc kubenswrapper[4755]: I1006 08:47:36.426941 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88915740-2d1e-4127-9c29-497f8d485408-inventory\") pod \"88915740-2d1e-4127-9c29-497f8d485408\" (UID: \"88915740-2d1e-4127-9c29-497f8d485408\") " Oct 06 08:47:36 crc kubenswrapper[4755]: I1006 08:47:36.436857 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88915740-2d1e-4127-9c29-497f8d485408-kube-api-access-nlt5w" (OuterVolumeSpecName: "kube-api-access-nlt5w") pod "88915740-2d1e-4127-9c29-497f8d485408" (UID: "88915740-2d1e-4127-9c29-497f8d485408"). InnerVolumeSpecName "kube-api-access-nlt5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:47:36 crc kubenswrapper[4755]: I1006 08:47:36.456734 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88915740-2d1e-4127-9c29-497f8d485408-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "88915740-2d1e-4127-9c29-497f8d485408" (UID: "88915740-2d1e-4127-9c29-497f8d485408"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:47:36 crc kubenswrapper[4755]: I1006 08:47:36.460795 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88915740-2d1e-4127-9c29-497f8d485408-inventory" (OuterVolumeSpecName: "inventory") pod "88915740-2d1e-4127-9c29-497f8d485408" (UID: "88915740-2d1e-4127-9c29-497f8d485408"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:47:36 crc kubenswrapper[4755]: I1006 08:47:36.529278 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88915740-2d1e-4127-9c29-497f8d485408-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:47:36 crc kubenswrapper[4755]: I1006 08:47:36.529308 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlt5w\" (UniqueName: \"kubernetes.io/projected/88915740-2d1e-4127-9c29-497f8d485408-kube-api-access-nlt5w\") on node \"crc\" DevicePath \"\"" Oct 06 08:47:36 crc kubenswrapper[4755]: I1006 08:47:36.529320 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88915740-2d1e-4127-9c29-497f8d485408-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:47:36 crc kubenswrapper[4755]: I1006 08:47:36.938807 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" 
event={"ID":"88915740-2d1e-4127-9c29-497f8d485408","Type":"ContainerDied","Data":"247ddd964099edec594fe248254281bcbb46e6528e3343212b23d019197b07df"} Oct 06 08:47:36 crc kubenswrapper[4755]: I1006 08:47:36.939070 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="247ddd964099edec594fe248254281bcbb46e6528e3343212b23d019197b07df" Oct 06 08:47:36 crc kubenswrapper[4755]: I1006 08:47:36.938890 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.006159 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf"] Oct 06 08:47:37 crc kubenswrapper[4755]: E1006 08:47:37.007874 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88915740-2d1e-4127-9c29-497f8d485408" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.007901 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="88915740-2d1e-4127-9c29-497f8d485408" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.008101 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="88915740-2d1e-4127-9c29-497f8d485408" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.008784 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.012196 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.012646 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.012868 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.013119 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.019009 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf"] Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.139618 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk5zc\" (UniqueName: \"kubernetes.io/projected/4931f32e-25ea-4ccc-8b80-83ae4422932c-kube-api-access-zk5zc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7pdpf\" (UID: \"4931f32e-25ea-4ccc-8b80-83ae4422932c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.139738 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4931f32e-25ea-4ccc-8b80-83ae4422932c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7pdpf\" (UID: \"4931f32e-25ea-4ccc-8b80-83ae4422932c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.139767 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4931f32e-25ea-4ccc-8b80-83ae4422932c-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7pdpf\" (UID: \"4931f32e-25ea-4ccc-8b80-83ae4422932c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.240935 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk5zc\" (UniqueName: \"kubernetes.io/projected/4931f32e-25ea-4ccc-8b80-83ae4422932c-kube-api-access-zk5zc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7pdpf\" (UID: \"4931f32e-25ea-4ccc-8b80-83ae4422932c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.241120 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4931f32e-25ea-4ccc-8b80-83ae4422932c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7pdpf\" (UID: \"4931f32e-25ea-4ccc-8b80-83ae4422932c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.241151 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4931f32e-25ea-4ccc-8b80-83ae4422932c-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7pdpf\" (UID: \"4931f32e-25ea-4ccc-8b80-83ae4422932c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.247849 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4931f32e-25ea-4ccc-8b80-83ae4422932c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7pdpf\" (UID: 
\"4931f32e-25ea-4ccc-8b80-83ae4422932c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.250829 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4931f32e-25ea-4ccc-8b80-83ae4422932c-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7pdpf\" (UID: \"4931f32e-25ea-4ccc-8b80-83ae4422932c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.258639 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk5zc\" (UniqueName: \"kubernetes.io/projected/4931f32e-25ea-4ccc-8b80-83ae4422932c-kube-api-access-zk5zc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7pdpf\" (UID: \"4931f32e-25ea-4ccc-8b80-83ae4422932c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.335418 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.845930 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf"] Oct 06 08:47:37 crc kubenswrapper[4755]: I1006 08:47:37.951787 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" event={"ID":"4931f32e-25ea-4ccc-8b80-83ae4422932c","Type":"ContainerStarted","Data":"6cbe97fe98d0b94478a3eee3445e1f0b5eb1aa8354302df62f20694ce784cf7f"} Oct 06 08:47:38 crc kubenswrapper[4755]: I1006 08:47:38.657360 4755 scope.go:117] "RemoveContainer" containerID="62a97897acc65ad0eea6e88efed9f52644cb2a964c8e2072c5a69fb68e1ff7be" Oct 06 08:47:38 crc kubenswrapper[4755]: I1006 08:47:38.680872 4755 scope.go:117] "RemoveContainer" containerID="b3ddb244271eadc85df60c48c578f543691e5bf8d1f44a978080f2e2f4faff18" Oct 06 08:47:38 crc kubenswrapper[4755]: I1006 08:47:38.700831 4755 scope.go:117] "RemoveContainer" containerID="5cf920348c3c8e57587dff0f1b9fdb867ecf72ea22190d4643c7e60400b43013" Oct 06 08:47:38 crc kubenswrapper[4755]: I1006 08:47:38.723787 4755 scope.go:117] "RemoveContainer" containerID="a47f518b5144094972f3f9136558e17d0941fb5862486077ff04c1d479eb8d5a" Oct 06 08:47:38 crc kubenswrapper[4755]: I1006 08:47:38.964394 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" event={"ID":"4931f32e-25ea-4ccc-8b80-83ae4422932c","Type":"ContainerStarted","Data":"56eca02a3c85d115cabbb953459caa90625a3a701be72b0604bd4e40cafb9eda"} Oct 06 08:47:38 crc kubenswrapper[4755]: I1006 08:47:38.991867 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" podStartSLOduration=2.534232454 podStartE2EDuration="2.991803836s" podCreationTimestamp="2025-10-06 08:47:36 +0000 
UTC" firstStartedPulling="2025-10-06 08:47:37.85704415 +0000 UTC m=+1514.686359364" lastFinishedPulling="2025-10-06 08:47:38.314615522 +0000 UTC m=+1515.143930746" observedRunningTime="2025-10-06 08:47:38.981292184 +0000 UTC m=+1515.810607418" watchObservedRunningTime="2025-10-06 08:47:38.991803836 +0000 UTC m=+1515.821119070" Oct 06 08:47:54 crc kubenswrapper[4755]: I1006 08:47:54.036198 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-pbf24"] Oct 06 08:47:54 crc kubenswrapper[4755]: I1006 08:47:54.044250 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xr272"] Oct 06 08:47:54 crc kubenswrapper[4755]: I1006 08:47:54.053361 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-pbf24"] Oct 06 08:47:54 crc kubenswrapper[4755]: I1006 08:47:54.060372 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xr272"] Oct 06 08:47:55 crc kubenswrapper[4755]: I1006 08:47:55.889391 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0214974c-3ea9-468d-84dc-a941cddf9f94" path="/var/lib/kubelet/pods/0214974c-3ea9-468d-84dc-a941cddf9f94/volumes" Oct 06 08:47:55 crc kubenswrapper[4755]: I1006 08:47:55.890176 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9edaa26f-c908-44f8-92ea-48f25d7febc3" path="/var/lib/kubelet/pods/9edaa26f-c908-44f8-92ea-48f25d7febc3/volumes" Oct 06 08:47:56 crc kubenswrapper[4755]: I1006 08:47:56.030311 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-sjq7l"] Oct 06 08:47:56 crc kubenswrapper[4755]: I1006 08:47:56.042536 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-sjq7l"] Oct 06 08:47:57 crc kubenswrapper[4755]: I1006 08:47:57.891353 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3254a8f8-f719-442c-b1a0-31a59dad705e" 
path="/var/lib/kubelet/pods/3254a8f8-f719-442c-b1a0-31a59dad705e/volumes" Oct 06 08:48:00 crc kubenswrapper[4755]: I1006 08:48:00.035723 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1e0a-account-create-ljjqb"] Oct 06 08:48:00 crc kubenswrapper[4755]: I1006 08:48:00.044109 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1e0a-account-create-ljjqb"] Oct 06 08:48:01 crc kubenswrapper[4755]: I1006 08:48:01.892977 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5ce319-fe48-4954-afae-bf595efca444" path="/var/lib/kubelet/pods/3c5ce319-fe48-4954-afae-bf595efca444/volumes" Oct 06 08:48:06 crc kubenswrapper[4755]: I1006 08:48:06.025673 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1de5-account-create-hkvn2"] Oct 06 08:48:06 crc kubenswrapper[4755]: I1006 08:48:06.033048 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1de5-account-create-hkvn2"] Oct 06 08:48:07 crc kubenswrapper[4755]: I1006 08:48:07.889698 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe22dcc-4c4c-43fb-9cc3-5968481dcbb7" path="/var/lib/kubelet/pods/abe22dcc-4c4c-43fb-9cc3-5968481dcbb7/volumes" Oct 06 08:48:10 crc kubenswrapper[4755]: I1006 08:48:10.029091 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-164f-account-create-7fnv7"] Oct 06 08:48:10 crc kubenswrapper[4755]: I1006 08:48:10.039688 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-164f-account-create-7fnv7"] Oct 06 08:48:11 crc kubenswrapper[4755]: I1006 08:48:11.889933 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98b7aae3-9795-47d6-89db-2e4f91af9a0e" path="/var/lib/kubelet/pods/98b7aae3-9795-47d6-89db-2e4f91af9a0e/volumes" Oct 06 08:48:12 crc kubenswrapper[4755]: I1006 08:48:12.268932 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" event={"ID":"4931f32e-25ea-4ccc-8b80-83ae4422932c","Type":"ContainerDied","Data":"56eca02a3c85d115cabbb953459caa90625a3a701be72b0604bd4e40cafb9eda"} Oct 06 08:48:12 crc kubenswrapper[4755]: I1006 08:48:12.269067 4755 generic.go:334] "Generic (PLEG): container finished" podID="4931f32e-25ea-4ccc-8b80-83ae4422932c" containerID="56eca02a3c85d115cabbb953459caa90625a3a701be72b0604bd4e40cafb9eda" exitCode=0 Oct 06 08:48:13 crc kubenswrapper[4755]: I1006 08:48:13.671877 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" Oct 06 08:48:13 crc kubenswrapper[4755]: I1006 08:48:13.833387 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk5zc\" (UniqueName: \"kubernetes.io/projected/4931f32e-25ea-4ccc-8b80-83ae4422932c-kube-api-access-zk5zc\") pod \"4931f32e-25ea-4ccc-8b80-83ae4422932c\" (UID: \"4931f32e-25ea-4ccc-8b80-83ae4422932c\") " Oct 06 08:48:13 crc kubenswrapper[4755]: I1006 08:48:13.833531 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4931f32e-25ea-4ccc-8b80-83ae4422932c-ssh-key\") pod \"4931f32e-25ea-4ccc-8b80-83ae4422932c\" (UID: \"4931f32e-25ea-4ccc-8b80-83ae4422932c\") " Oct 06 08:48:13 crc kubenswrapper[4755]: I1006 08:48:13.833675 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4931f32e-25ea-4ccc-8b80-83ae4422932c-inventory\") pod \"4931f32e-25ea-4ccc-8b80-83ae4422932c\" (UID: \"4931f32e-25ea-4ccc-8b80-83ae4422932c\") " Oct 06 08:48:13 crc kubenswrapper[4755]: I1006 08:48:13.840034 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4931f32e-25ea-4ccc-8b80-83ae4422932c-kube-api-access-zk5zc" (OuterVolumeSpecName: 
"kube-api-access-zk5zc") pod "4931f32e-25ea-4ccc-8b80-83ae4422932c" (UID: "4931f32e-25ea-4ccc-8b80-83ae4422932c"). InnerVolumeSpecName "kube-api-access-zk5zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:48:13 crc kubenswrapper[4755]: I1006 08:48:13.867103 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4931f32e-25ea-4ccc-8b80-83ae4422932c-inventory" (OuterVolumeSpecName: "inventory") pod "4931f32e-25ea-4ccc-8b80-83ae4422932c" (UID: "4931f32e-25ea-4ccc-8b80-83ae4422932c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:48:13 crc kubenswrapper[4755]: I1006 08:48:13.874842 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4931f32e-25ea-4ccc-8b80-83ae4422932c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4931f32e-25ea-4ccc-8b80-83ae4422932c" (UID: "4931f32e-25ea-4ccc-8b80-83ae4422932c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:48:13 crc kubenswrapper[4755]: I1006 08:48:13.936300 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk5zc\" (UniqueName: \"kubernetes.io/projected/4931f32e-25ea-4ccc-8b80-83ae4422932c-kube-api-access-zk5zc\") on node \"crc\" DevicePath \"\"" Oct 06 08:48:13 crc kubenswrapper[4755]: I1006 08:48:13.936342 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4931f32e-25ea-4ccc-8b80-83ae4422932c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:48:13 crc kubenswrapper[4755]: I1006 08:48:13.936354 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4931f32e-25ea-4ccc-8b80-83ae4422932c-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.285988 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" event={"ID":"4931f32e-25ea-4ccc-8b80-83ae4422932c","Type":"ContainerDied","Data":"6cbe97fe98d0b94478a3eee3445e1f0b5eb1aa8354302df62f20694ce784cf7f"} Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.286038 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cbe97fe98d0b94478a3eee3445e1f0b5eb1aa8354302df62f20694ce784cf7f" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.286112 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.361404 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz"] Oct 06 08:48:14 crc kubenswrapper[4755]: E1006 08:48:14.361927 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4931f32e-25ea-4ccc-8b80-83ae4422932c" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.361956 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4931f32e-25ea-4ccc-8b80-83ae4422932c" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.362191 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4931f32e-25ea-4ccc-8b80-83ae4422932c" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.362982 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.365733 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.366494 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.367552 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.368247 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.369129 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz"] Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.548824 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/642222bb-af72-44e4-a6ee-bb8f97e23c93-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz\" (UID: \"642222bb-af72-44e4-a6ee-bb8f97e23c93\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.548983 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/642222bb-af72-44e4-a6ee-bb8f97e23c93-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz\" (UID: \"642222bb-af72-44e4-a6ee-bb8f97e23c93\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.549620 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv5f4\" (UniqueName: \"kubernetes.io/projected/642222bb-af72-44e4-a6ee-bb8f97e23c93-kube-api-access-pv5f4\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz\" (UID: \"642222bb-af72-44e4-a6ee-bb8f97e23c93\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.651720 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv5f4\" (UniqueName: \"kubernetes.io/projected/642222bb-af72-44e4-a6ee-bb8f97e23c93-kube-api-access-pv5f4\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz\" (UID: \"642222bb-af72-44e4-a6ee-bb8f97e23c93\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.651798 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/642222bb-af72-44e4-a6ee-bb8f97e23c93-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz\" (UID: \"642222bb-af72-44e4-a6ee-bb8f97e23c93\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.651835 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/642222bb-af72-44e4-a6ee-bb8f97e23c93-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz\" (UID: \"642222bb-af72-44e4-a6ee-bb8f97e23c93\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.656966 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/642222bb-af72-44e4-a6ee-bb8f97e23c93-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz\" (UID: 
\"642222bb-af72-44e4-a6ee-bb8f97e23c93\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.662266 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/642222bb-af72-44e4-a6ee-bb8f97e23c93-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz\" (UID: \"642222bb-af72-44e4-a6ee-bb8f97e23c93\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.671733 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv5f4\" (UniqueName: \"kubernetes.io/projected/642222bb-af72-44e4-a6ee-bb8f97e23c93-kube-api-access-pv5f4\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz\" (UID: \"642222bb-af72-44e4-a6ee-bb8f97e23c93\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" Oct 06 08:48:14 crc kubenswrapper[4755]: I1006 08:48:14.678316 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" Oct 06 08:48:15 crc kubenswrapper[4755]: I1006 08:48:15.169083 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz"] Oct 06 08:48:15 crc kubenswrapper[4755]: I1006 08:48:15.174921 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 08:48:15 crc kubenswrapper[4755]: I1006 08:48:15.294544 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" event={"ID":"642222bb-af72-44e4-a6ee-bb8f97e23c93","Type":"ContainerStarted","Data":"6c1919bf268e58e2fbe9157ef5461fd73c605902625db7cf1e289c18d3349d59"} Oct 06 08:48:16 crc kubenswrapper[4755]: I1006 08:48:16.302899 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" event={"ID":"642222bb-af72-44e4-a6ee-bb8f97e23c93","Type":"ContainerStarted","Data":"114f9fa3a9c7636fb5242c2d41989b6580f758570bb9bb9e1a7c030887d08453"} Oct 06 08:48:16 crc kubenswrapper[4755]: I1006 08:48:16.338546 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" podStartSLOduration=1.9376554179999999 podStartE2EDuration="2.338520147s" podCreationTimestamp="2025-10-06 08:48:14 +0000 UTC" firstStartedPulling="2025-10-06 08:48:15.174686076 +0000 UTC m=+1552.004001290" lastFinishedPulling="2025-10-06 08:48:15.575550805 +0000 UTC m=+1552.404866019" observedRunningTime="2025-10-06 08:48:16.334125747 +0000 UTC m=+1553.163440971" watchObservedRunningTime="2025-10-06 08:48:16.338520147 +0000 UTC m=+1553.167835361" Oct 06 08:48:20 crc kubenswrapper[4755]: I1006 08:48:20.334889 4755 generic.go:334] "Generic (PLEG): container finished" podID="642222bb-af72-44e4-a6ee-bb8f97e23c93" 
containerID="114f9fa3a9c7636fb5242c2d41989b6580f758570bb9bb9e1a7c030887d08453" exitCode=0 Oct 06 08:48:20 crc kubenswrapper[4755]: I1006 08:48:20.334986 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" event={"ID":"642222bb-af72-44e4-a6ee-bb8f97e23c93","Type":"ContainerDied","Data":"114f9fa3a9c7636fb5242c2d41989b6580f758570bb9bb9e1a7c030887d08453"} Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.272857 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.352395 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" event={"ID":"642222bb-af72-44e4-a6ee-bb8f97e23c93","Type":"ContainerDied","Data":"6c1919bf268e58e2fbe9157ef5461fd73c605902625db7cf1e289c18d3349d59"} Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.352432 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c1919bf268e58e2fbe9157ef5461fd73c605902625db7cf1e289c18d3349d59" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.352490 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.398668 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/642222bb-af72-44e4-a6ee-bb8f97e23c93-ssh-key\") pod \"642222bb-af72-44e4-a6ee-bb8f97e23c93\" (UID: \"642222bb-af72-44e4-a6ee-bb8f97e23c93\") " Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.399209 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv5f4\" (UniqueName: \"kubernetes.io/projected/642222bb-af72-44e4-a6ee-bb8f97e23c93-kube-api-access-pv5f4\") pod \"642222bb-af72-44e4-a6ee-bb8f97e23c93\" (UID: \"642222bb-af72-44e4-a6ee-bb8f97e23c93\") " Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.399325 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/642222bb-af72-44e4-a6ee-bb8f97e23c93-inventory\") pod \"642222bb-af72-44e4-a6ee-bb8f97e23c93\" (UID: \"642222bb-af72-44e4-a6ee-bb8f97e23c93\") " Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.406736 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/642222bb-af72-44e4-a6ee-bb8f97e23c93-kube-api-access-pv5f4" (OuterVolumeSpecName: "kube-api-access-pv5f4") pod "642222bb-af72-44e4-a6ee-bb8f97e23c93" (UID: "642222bb-af72-44e4-a6ee-bb8f97e23c93"). InnerVolumeSpecName "kube-api-access-pv5f4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.439186 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7"] Oct 06 08:48:22 crc kubenswrapper[4755]: E1006 08:48:22.439641 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="642222bb-af72-44e4-a6ee-bb8f97e23c93" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.439659 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="642222bb-af72-44e4-a6ee-bb8f97e23c93" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.439881 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="642222bb-af72-44e4-a6ee-bb8f97e23c93" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.440794 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.446095 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/642222bb-af72-44e4-a6ee-bb8f97e23c93-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "642222bb-af72-44e4-a6ee-bb8f97e23c93" (UID: "642222bb-af72-44e4-a6ee-bb8f97e23c93"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.449649 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7"] Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.449883 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/642222bb-af72-44e4-a6ee-bb8f97e23c93-inventory" (OuterVolumeSpecName: "inventory") pod "642222bb-af72-44e4-a6ee-bb8f97e23c93" (UID: "642222bb-af72-44e4-a6ee-bb8f97e23c93"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.506211 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30def537-c2c5-4042-bed2-29c3a6f6bc57-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7\" (UID: \"30def537-c2c5-4042-bed2-29c3a6f6bc57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.520058 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30def537-c2c5-4042-bed2-29c3a6f6bc57-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7\" (UID: \"30def537-c2c5-4042-bed2-29c3a6f6bc57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.520171 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcsjm\" (UniqueName: \"kubernetes.io/projected/30def537-c2c5-4042-bed2-29c3a6f6bc57-kube-api-access-zcsjm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7\" (UID: \"30def537-c2c5-4042-bed2-29c3a6f6bc57\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.520372 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv5f4\" (UniqueName: \"kubernetes.io/projected/642222bb-af72-44e4-a6ee-bb8f97e23c93-kube-api-access-pv5f4\") on node \"crc\" DevicePath \"\"" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.520390 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/642222bb-af72-44e4-a6ee-bb8f97e23c93-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.520410 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/642222bb-af72-44e4-a6ee-bb8f97e23c93-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.621732 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30def537-c2c5-4042-bed2-29c3a6f6bc57-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7\" (UID: \"30def537-c2c5-4042-bed2-29c3a6f6bc57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.621809 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcsjm\" (UniqueName: \"kubernetes.io/projected/30def537-c2c5-4042-bed2-29c3a6f6bc57-kube-api-access-zcsjm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7\" (UID: \"30def537-c2c5-4042-bed2-29c3a6f6bc57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.621928 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30def537-c2c5-4042-bed2-29c3a6f6bc57-ssh-key\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7\" (UID: \"30def537-c2c5-4042-bed2-29c3a6f6bc57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.628526 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30def537-c2c5-4042-bed2-29c3a6f6bc57-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7\" (UID: \"30def537-c2c5-4042-bed2-29c3a6f6bc57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.632755 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30def537-c2c5-4042-bed2-29c3a6f6bc57-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7\" (UID: \"30def537-c2c5-4042-bed2-29c3a6f6bc57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.639984 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcsjm\" (UniqueName: \"kubernetes.io/projected/30def537-c2c5-4042-bed2-29c3a6f6bc57-kube-api-access-zcsjm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7\" (UID: \"30def537-c2c5-4042-bed2-29c3a6f6bc57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" Oct 06 08:48:22 crc kubenswrapper[4755]: I1006 08:48:22.840246 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" Oct 06 08:48:23 crc kubenswrapper[4755]: I1006 08:48:23.348021 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7"] Oct 06 08:48:23 crc kubenswrapper[4755]: I1006 08:48:23.362276 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" event={"ID":"30def537-c2c5-4042-bed2-29c3a6f6bc57","Type":"ContainerStarted","Data":"d37930aacb21698f4746e95dbc224b99e09e73c72f69071f5137b866d740e74e"} Oct 06 08:48:24 crc kubenswrapper[4755]: I1006 08:48:24.374787 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" event={"ID":"30def537-c2c5-4042-bed2-29c3a6f6bc57","Type":"ContainerStarted","Data":"90f9a6564fc9de9e5c5f2394cd1d105cb2e18666540b5631cfe1890c5dd09252"} Oct 06 08:48:24 crc kubenswrapper[4755]: I1006 08:48:24.406954 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" podStartSLOduration=1.697571183 podStartE2EDuration="2.406932298s" podCreationTimestamp="2025-10-06 08:48:22 +0000 UTC" firstStartedPulling="2025-10-06 08:48:23.341772517 +0000 UTC m=+1560.171087731" lastFinishedPulling="2025-10-06 08:48:24.051133632 +0000 UTC m=+1560.880448846" observedRunningTime="2025-10-06 08:48:24.406805985 +0000 UTC m=+1561.236121219" watchObservedRunningTime="2025-10-06 08:48:24.406932298 +0000 UTC m=+1561.236247512" Oct 06 08:48:27 crc kubenswrapper[4755]: I1006 08:48:27.060022 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bxksl"] Oct 06 08:48:27 crc kubenswrapper[4755]: I1006 08:48:27.073461 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-bqjxk"] Oct 06 08:48:27 crc kubenswrapper[4755]: I1006 
08:48:27.082117 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bxksl"] Oct 06 08:48:27 crc kubenswrapper[4755]: I1006 08:48:27.088941 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-bqjxk"] Oct 06 08:48:27 crc kubenswrapper[4755]: I1006 08:48:27.889253 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e07be36-828e-457f-aa2e-091536b43617" path="/var/lib/kubelet/pods/1e07be36-828e-457f-aa2e-091536b43617/volumes" Oct 06 08:48:27 crc kubenswrapper[4755]: I1006 08:48:27.890065 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b72f6f8-209e-492d-87af-03810abce3bd" path="/var/lib/kubelet/pods/3b72f6f8-209e-492d-87af-03810abce3bd/volumes" Oct 06 08:48:31 crc kubenswrapper[4755]: I1006 08:48:31.027460 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-fh9p5"] Oct 06 08:48:31 crc kubenswrapper[4755]: I1006 08:48:31.036426 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-fr5h2"] Oct 06 08:48:31 crc kubenswrapper[4755]: I1006 08:48:31.046826 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-fh9p5"] Oct 06 08:48:31 crc kubenswrapper[4755]: I1006 08:48:31.054609 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fr5h2"] Oct 06 08:48:31 crc kubenswrapper[4755]: I1006 08:48:31.889724 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5227182b-b51d-46ee-a837-44eb07a36637" path="/var/lib/kubelet/pods/5227182b-b51d-46ee-a837-44eb07a36637/volumes" Oct 06 08:48:31 crc kubenswrapper[4755]: I1006 08:48:31.890409 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97f025db-474e-4629-96e7-2ebbd9413fc4" path="/var/lib/kubelet/pods/97f025db-474e-4629-96e7-2ebbd9413fc4/volumes" Oct 06 08:48:35 crc kubenswrapper[4755]: I1006 08:48:35.040898 4755 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/keystone-db-sync-qwklr"] Oct 06 08:48:35 crc kubenswrapper[4755]: I1006 08:48:35.053137 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-qwklr"] Oct 06 08:48:35 crc kubenswrapper[4755]: I1006 08:48:35.893173 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ceb537e-0d92-47ba-8cf4-470c3caa3765" path="/var/lib/kubelet/pods/8ceb537e-0d92-47ba-8cf4-470c3caa3765/volumes" Oct 06 08:48:38 crc kubenswrapper[4755]: I1006 08:48:38.034933 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ef82-account-create-4v5fj"] Oct 06 08:48:38 crc kubenswrapper[4755]: I1006 08:48:38.043015 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8bef-account-create-2qpz4"] Oct 06 08:48:38 crc kubenswrapper[4755]: I1006 08:48:38.052065 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ef82-account-create-4v5fj"] Oct 06 08:48:38 crc kubenswrapper[4755]: I1006 08:48:38.058936 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8bef-account-create-2qpz4"] Oct 06 08:48:38 crc kubenswrapper[4755]: I1006 08:48:38.828114 4755 scope.go:117] "RemoveContainer" containerID="5ef92c5f3594f985c476edb365b34fba9702a3e539294e14d9e2ea700a89c348" Oct 06 08:48:38 crc kubenswrapper[4755]: I1006 08:48:38.864334 4755 scope.go:117] "RemoveContainer" containerID="3db8da0f0690dd6cb2ab0504c0c382d8510a60669bd723374d03c10babb7d06a" Oct 06 08:48:38 crc kubenswrapper[4755]: I1006 08:48:38.899673 4755 scope.go:117] "RemoveContainer" containerID="8080ac391ff55c171c520aef4b278f7c42fae02f302c38f582405d245bb858e1" Oct 06 08:48:38 crc kubenswrapper[4755]: I1006 08:48:38.942809 4755 scope.go:117] "RemoveContainer" containerID="07b93c917a8e8693bdbd74d3bcc9162c39421cdc741329e94d73d2d049f5c90f" Oct 06 08:48:38 crc kubenswrapper[4755]: I1006 08:48:38.991772 4755 scope.go:117] "RemoveContainer" 
containerID="41b32dd76f015a9bca1dc4959b99265335c32fe13cccba0971249a9ebae11103" Oct 06 08:48:39 crc kubenswrapper[4755]: I1006 08:48:39.030332 4755 scope.go:117] "RemoveContainer" containerID="21339d98c34787c448e9ed0f53bd4cd06bada5f4decadcc27caacb6438473fa5" Oct 06 08:48:39 crc kubenswrapper[4755]: I1006 08:48:39.072934 4755 scope.go:117] "RemoveContainer" containerID="9c47c89b198612f8186c7879c23ee531f5979ab794ada6c97e91dbea021675c6" Oct 06 08:48:39 crc kubenswrapper[4755]: I1006 08:48:39.092532 4755 scope.go:117] "RemoveContainer" containerID="0bca671031cb203de217a5b6c7ecd4e21457f99bff27da7127a181d9a545765f" Oct 06 08:48:39 crc kubenswrapper[4755]: I1006 08:48:39.135180 4755 scope.go:117] "RemoveContainer" containerID="6099ba2225afc605e432fc08f46d189f6b95b425f091591fe2c4aad0d5e20131" Oct 06 08:48:39 crc kubenswrapper[4755]: I1006 08:48:39.154025 4755 scope.go:117] "RemoveContainer" containerID="dc7672cfba047a76274ca9505095c74182f23f1290799700674b7119fe58d9b2" Oct 06 08:48:39 crc kubenswrapper[4755]: I1006 08:48:39.174172 4755 scope.go:117] "RemoveContainer" containerID="e0d49d956ef0dd0a1d1255ab6ca8ec652b741bcba585ce9cccff8a872551c566" Oct 06 08:48:39 crc kubenswrapper[4755]: I1006 08:48:39.890176 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c44bae0-2561-4bfd-9a0d-0f5130838f9c" path="/var/lib/kubelet/pods/1c44bae0-2561-4bfd-9a0d-0f5130838f9c/volumes" Oct 06 08:48:39 crc kubenswrapper[4755]: I1006 08:48:39.891223 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9265504b-7527-495e-86bb-6042cc6ddec7" path="/var/lib/kubelet/pods/9265504b-7527-495e-86bb-6042cc6ddec7/volumes" Oct 06 08:48:51 crc kubenswrapper[4755]: I1006 08:48:51.037219 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fmkjk"] Oct 06 08:48:51 crc kubenswrapper[4755]: I1006 08:48:51.046948 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fmkjk"] Oct 06 08:48:51 crc 
kubenswrapper[4755]: I1006 08:48:51.888266 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce5daa56-27db-42d7-9f80-cb230c855299" path="/var/lib/kubelet/pods/ce5daa56-27db-42d7-9f80-cb230c855299/volumes" Oct 06 08:48:52 crc kubenswrapper[4755]: I1006 08:48:52.029582 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-73af-account-create-cfnhn"] Oct 06 08:48:52 crc kubenswrapper[4755]: I1006 08:48:52.037594 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-73af-account-create-cfnhn"] Oct 06 08:48:53 crc kubenswrapper[4755]: I1006 08:48:53.894230 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="370fc33c-40d0-42b7-9822-a44512c4d881" path="/var/lib/kubelet/pods/370fc33c-40d0-42b7-9822-a44512c4d881/volumes" Oct 06 08:48:55 crc kubenswrapper[4755]: I1006 08:48:55.357173 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-476mg"] Oct 06 08:48:55 crc kubenswrapper[4755]: I1006 08:48:55.360354 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-476mg" Oct 06 08:48:55 crc kubenswrapper[4755]: I1006 08:48:55.370548 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-476mg"] Oct 06 08:48:55 crc kubenswrapper[4755]: I1006 08:48:55.415089 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d15cd369-c62b-4e74-b181-a9760bf3213c-catalog-content\") pod \"community-operators-476mg\" (UID: \"d15cd369-c62b-4e74-b181-a9760bf3213c\") " pod="openshift-marketplace/community-operators-476mg" Oct 06 08:48:55 crc kubenswrapper[4755]: I1006 08:48:55.415193 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d15cd369-c62b-4e74-b181-a9760bf3213c-utilities\") pod \"community-operators-476mg\" (UID: \"d15cd369-c62b-4e74-b181-a9760bf3213c\") " pod="openshift-marketplace/community-operators-476mg" Oct 06 08:48:55 crc kubenswrapper[4755]: I1006 08:48:55.415254 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcjpp\" (UniqueName: \"kubernetes.io/projected/d15cd369-c62b-4e74-b181-a9760bf3213c-kube-api-access-rcjpp\") pod \"community-operators-476mg\" (UID: \"d15cd369-c62b-4e74-b181-a9760bf3213c\") " pod="openshift-marketplace/community-operators-476mg" Oct 06 08:48:55 crc kubenswrapper[4755]: I1006 08:48:55.517704 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d15cd369-c62b-4e74-b181-a9760bf3213c-catalog-content\") pod \"community-operators-476mg\" (UID: \"d15cd369-c62b-4e74-b181-a9760bf3213c\") " pod="openshift-marketplace/community-operators-476mg" Oct 06 08:48:55 crc kubenswrapper[4755]: I1006 08:48:55.517792 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d15cd369-c62b-4e74-b181-a9760bf3213c-utilities\") pod \"community-operators-476mg\" (UID: \"d15cd369-c62b-4e74-b181-a9760bf3213c\") " pod="openshift-marketplace/community-operators-476mg" Oct 06 08:48:55 crc kubenswrapper[4755]: I1006 08:48:55.517839 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcjpp\" (UniqueName: \"kubernetes.io/projected/d15cd369-c62b-4e74-b181-a9760bf3213c-kube-api-access-rcjpp\") pod \"community-operators-476mg\" (UID: \"d15cd369-c62b-4e74-b181-a9760bf3213c\") " pod="openshift-marketplace/community-operators-476mg" Oct 06 08:48:55 crc kubenswrapper[4755]: I1006 08:48:55.518362 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d15cd369-c62b-4e74-b181-a9760bf3213c-catalog-content\") pod \"community-operators-476mg\" (UID: \"d15cd369-c62b-4e74-b181-a9760bf3213c\") " pod="openshift-marketplace/community-operators-476mg" Oct 06 08:48:55 crc kubenswrapper[4755]: I1006 08:48:55.518375 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d15cd369-c62b-4e74-b181-a9760bf3213c-utilities\") pod \"community-operators-476mg\" (UID: \"d15cd369-c62b-4e74-b181-a9760bf3213c\") " pod="openshift-marketplace/community-operators-476mg" Oct 06 08:48:55 crc kubenswrapper[4755]: I1006 08:48:55.539099 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcjpp\" (UniqueName: \"kubernetes.io/projected/d15cd369-c62b-4e74-b181-a9760bf3213c-kube-api-access-rcjpp\") pod \"community-operators-476mg\" (UID: \"d15cd369-c62b-4e74-b181-a9760bf3213c\") " pod="openshift-marketplace/community-operators-476mg" Oct 06 08:48:55 crc kubenswrapper[4755]: I1006 08:48:55.688435 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-476mg" Oct 06 08:48:56 crc kubenswrapper[4755]: I1006 08:48:56.202949 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-476mg"] Oct 06 08:48:56 crc kubenswrapper[4755]: I1006 08:48:56.638646 4755 generic.go:334] "Generic (PLEG): container finished" podID="d15cd369-c62b-4e74-b181-a9760bf3213c" containerID="1944fccdb874a9c58e19277840200668ce4712ef33ffe186e3e389d2b45446a0" exitCode=0 Oct 06 08:48:56 crc kubenswrapper[4755]: I1006 08:48:56.638697 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-476mg" event={"ID":"d15cd369-c62b-4e74-b181-a9760bf3213c","Type":"ContainerDied","Data":"1944fccdb874a9c58e19277840200668ce4712ef33ffe186e3e389d2b45446a0"} Oct 06 08:48:56 crc kubenswrapper[4755]: I1006 08:48:56.638923 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-476mg" event={"ID":"d15cd369-c62b-4e74-b181-a9760bf3213c","Type":"ContainerStarted","Data":"8dd2996fed73eaec8dd912903076b5013cd5a0da98ee9279f08e0e389d401750"} Oct 06 08:48:57 crc kubenswrapper[4755]: I1006 08:48:57.650077 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-476mg" event={"ID":"d15cd369-c62b-4e74-b181-a9760bf3213c","Type":"ContainerStarted","Data":"4a990036cf7a2399eec5476c5ac6aa44082bd166d4b15e31e40359e0932d6375"} Oct 06 08:48:58 crc kubenswrapper[4755]: I1006 08:48:58.661431 4755 generic.go:334] "Generic (PLEG): container finished" podID="d15cd369-c62b-4e74-b181-a9760bf3213c" containerID="4a990036cf7a2399eec5476c5ac6aa44082bd166d4b15e31e40359e0932d6375" exitCode=0 Oct 06 08:48:58 crc kubenswrapper[4755]: I1006 08:48:58.661487 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-476mg" 
event={"ID":"d15cd369-c62b-4e74-b181-a9760bf3213c","Type":"ContainerDied","Data":"4a990036cf7a2399eec5476c5ac6aa44082bd166d4b15e31e40359e0932d6375"} Oct 06 08:48:59 crc kubenswrapper[4755]: I1006 08:48:59.671651 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-476mg" event={"ID":"d15cd369-c62b-4e74-b181-a9760bf3213c","Type":"ContainerStarted","Data":"33b3e6ea9811e009e13e2b5e5d46fdd5fb4dbbadedd470844635b20d3bacc843"} Oct 06 08:48:59 crc kubenswrapper[4755]: I1006 08:48:59.695314 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-476mg" podStartSLOduration=2.225471765 podStartE2EDuration="4.695294209s" podCreationTimestamp="2025-10-06 08:48:55 +0000 UTC" firstStartedPulling="2025-10-06 08:48:56.640112629 +0000 UTC m=+1593.469427843" lastFinishedPulling="2025-10-06 08:48:59.109935073 +0000 UTC m=+1595.939250287" observedRunningTime="2025-10-06 08:48:59.690991612 +0000 UTC m=+1596.520306836" watchObservedRunningTime="2025-10-06 08:48:59.695294209 +0000 UTC m=+1596.524609423" Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.152058 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cb9pf"] Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.154220 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.171660 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cb9pf"] Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.340772 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b17fe854-2e09-42fe-a11a-ee5c6c575514-utilities\") pod \"redhat-operators-cb9pf\" (UID: \"b17fe854-2e09-42fe-a11a-ee5c6c575514\") " pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.341511 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b17fe854-2e09-42fe-a11a-ee5c6c575514-catalog-content\") pod \"redhat-operators-cb9pf\" (UID: \"b17fe854-2e09-42fe-a11a-ee5c6c575514\") " pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.341700 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8mxw\" (UniqueName: \"kubernetes.io/projected/b17fe854-2e09-42fe-a11a-ee5c6c575514-kube-api-access-m8mxw\") pod \"redhat-operators-cb9pf\" (UID: \"b17fe854-2e09-42fe-a11a-ee5c6c575514\") " pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.442976 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b17fe854-2e09-42fe-a11a-ee5c6c575514-utilities\") pod \"redhat-operators-cb9pf\" (UID: \"b17fe854-2e09-42fe-a11a-ee5c6c575514\") " pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.443326 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b17fe854-2e09-42fe-a11a-ee5c6c575514-catalog-content\") pod \"redhat-operators-cb9pf\" (UID: \"b17fe854-2e09-42fe-a11a-ee5c6c575514\") " pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.443453 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8mxw\" (UniqueName: \"kubernetes.io/projected/b17fe854-2e09-42fe-a11a-ee5c6c575514-kube-api-access-m8mxw\") pod \"redhat-operators-cb9pf\" (UID: \"b17fe854-2e09-42fe-a11a-ee5c6c575514\") " pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.443477 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b17fe854-2e09-42fe-a11a-ee5c6c575514-utilities\") pod \"redhat-operators-cb9pf\" (UID: \"b17fe854-2e09-42fe-a11a-ee5c6c575514\") " pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.443761 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b17fe854-2e09-42fe-a11a-ee5c6c575514-catalog-content\") pod \"redhat-operators-cb9pf\" (UID: \"b17fe854-2e09-42fe-a11a-ee5c6c575514\") " pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.462884 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8mxw\" (UniqueName: \"kubernetes.io/projected/b17fe854-2e09-42fe-a11a-ee5c6c575514-kube-api-access-m8mxw\") pod \"redhat-operators-cb9pf\" (UID: \"b17fe854-2e09-42fe-a11a-ee5c6c575514\") " pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.473417 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:00 crc kubenswrapper[4755]: I1006 08:49:00.720354 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cb9pf"] Oct 06 08:49:00 crc kubenswrapper[4755]: W1006 08:49:00.728824 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb17fe854_2e09_42fe_a11a_ee5c6c575514.slice/crio-2563b90e58c8451fc229b367c9129973caa29e092d87a2666cab142112ab79f5 WatchSource:0}: Error finding container 2563b90e58c8451fc229b367c9129973caa29e092d87a2666cab142112ab79f5: Status 404 returned error can't find the container with id 2563b90e58c8451fc229b367c9129973caa29e092d87a2666cab142112ab79f5 Oct 06 08:49:01 crc kubenswrapper[4755]: I1006 08:49:01.698447 4755 generic.go:334] "Generic (PLEG): container finished" podID="b17fe854-2e09-42fe-a11a-ee5c6c575514" containerID="584f2df1a5b6744d76917d985e42bae9d68d206a78184ad7841be0a1111b9df5" exitCode=0 Oct 06 08:49:01 crc kubenswrapper[4755]: I1006 08:49:01.698582 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb9pf" event={"ID":"b17fe854-2e09-42fe-a11a-ee5c6c575514","Type":"ContainerDied","Data":"584f2df1a5b6744d76917d985e42bae9d68d206a78184ad7841be0a1111b9df5"} Oct 06 08:49:01 crc kubenswrapper[4755]: I1006 08:49:01.698845 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb9pf" event={"ID":"b17fe854-2e09-42fe-a11a-ee5c6c575514","Type":"ContainerStarted","Data":"2563b90e58c8451fc229b367c9129973caa29e092d87a2666cab142112ab79f5"} Oct 06 08:49:03 crc kubenswrapper[4755]: E1006 08:49:03.604483 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb17fe854_2e09_42fe_a11a_ee5c6c575514.slice/crio-conmon-100389756d64af0c70c14dfb434907ed44280b0b0ce5a86144e828e0081d0630.scope\": RecentStats: unable to find data in memory cache]" Oct 06 08:49:03 crc kubenswrapper[4755]: I1006 08:49:03.720600 4755 generic.go:334] "Generic (PLEG): container finished" podID="b17fe854-2e09-42fe-a11a-ee5c6c575514" containerID="100389756d64af0c70c14dfb434907ed44280b0b0ce5a86144e828e0081d0630" exitCode=0 Oct 06 08:49:03 crc kubenswrapper[4755]: I1006 08:49:03.720657 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb9pf" event={"ID":"b17fe854-2e09-42fe-a11a-ee5c6c575514","Type":"ContainerDied","Data":"100389756d64af0c70c14dfb434907ed44280b0b0ce5a86144e828e0081d0630"} Oct 06 08:49:04 crc kubenswrapper[4755]: I1006 08:49:04.030537 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-95kzq"] Oct 06 08:49:04 crc kubenswrapper[4755]: I1006 08:49:04.041414 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-95kzq"] Oct 06 08:49:05 crc kubenswrapper[4755]: I1006 08:49:05.688638 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-476mg" Oct 06 08:49:05 crc kubenswrapper[4755]: I1006 08:49:05.689019 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-476mg" Oct 06 08:49:05 crc kubenswrapper[4755]: I1006 08:49:05.745261 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-476mg" Oct 06 08:49:05 crc kubenswrapper[4755]: I1006 08:49:05.773616 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb9pf" 
event={"ID":"b17fe854-2e09-42fe-a11a-ee5c6c575514","Type":"ContainerStarted","Data":"302b41b22e6050526eb912a9dc29e54b4f4803da4c48a7e6b451822cd1bda15a"} Oct 06 08:49:05 crc kubenswrapper[4755]: I1006 08:49:05.793415 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cb9pf" podStartSLOduration=3.044042225 podStartE2EDuration="5.793398204s" podCreationTimestamp="2025-10-06 08:49:00 +0000 UTC" firstStartedPulling="2025-10-06 08:49:01.701404458 +0000 UTC m=+1598.530719672" lastFinishedPulling="2025-10-06 08:49:04.450760437 +0000 UTC m=+1601.280075651" observedRunningTime="2025-10-06 08:49:05.792970363 +0000 UTC m=+1602.622285597" watchObservedRunningTime="2025-10-06 08:49:05.793398204 +0000 UTC m=+1602.622713418" Oct 06 08:49:05 crc kubenswrapper[4755]: I1006 08:49:05.822364 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-476mg" Oct 06 08:49:05 crc kubenswrapper[4755]: I1006 08:49:05.889257 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d420f11b-9596-4bbf-9a4c-c13e39020db9" path="/var/lib/kubelet/pods/d420f11b-9596-4bbf-9a4c-c13e39020db9/volumes" Oct 06 08:49:09 crc kubenswrapper[4755]: I1006 08:49:09.344528 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-476mg"] Oct 06 08:49:09 crc kubenswrapper[4755]: I1006 08:49:09.345083 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-476mg" podUID="d15cd369-c62b-4e74-b181-a9760bf3213c" containerName="registry-server" containerID="cri-o://33b3e6ea9811e009e13e2b5e5d46fdd5fb4dbbadedd470844635b20d3bacc843" gracePeriod=2 Oct 06 08:49:09 crc kubenswrapper[4755]: I1006 08:49:09.807110 4755 generic.go:334] "Generic (PLEG): container finished" podID="d15cd369-c62b-4e74-b181-a9760bf3213c" 
containerID="33b3e6ea9811e009e13e2b5e5d46fdd5fb4dbbadedd470844635b20d3bacc843" exitCode=0 Oct 06 08:49:09 crc kubenswrapper[4755]: I1006 08:49:09.807199 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-476mg" event={"ID":"d15cd369-c62b-4e74-b181-a9760bf3213c","Type":"ContainerDied","Data":"33b3e6ea9811e009e13e2b5e5d46fdd5fb4dbbadedd470844635b20d3bacc843"} Oct 06 08:49:09 crc kubenswrapper[4755]: I1006 08:49:09.807608 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-476mg" event={"ID":"d15cd369-c62b-4e74-b181-a9760bf3213c","Type":"ContainerDied","Data":"8dd2996fed73eaec8dd912903076b5013cd5a0da98ee9279f08e0e389d401750"} Oct 06 08:49:09 crc kubenswrapper[4755]: I1006 08:49:09.807630 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dd2996fed73eaec8dd912903076b5013cd5a0da98ee9279f08e0e389d401750" Oct 06 08:49:09 crc kubenswrapper[4755]: I1006 08:49:09.867994 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-476mg" Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.069230 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d15cd369-c62b-4e74-b181-a9760bf3213c-utilities\") pod \"d15cd369-c62b-4e74-b181-a9760bf3213c\" (UID: \"d15cd369-c62b-4e74-b181-a9760bf3213c\") " Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.069408 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcjpp\" (UniqueName: \"kubernetes.io/projected/d15cd369-c62b-4e74-b181-a9760bf3213c-kube-api-access-rcjpp\") pod \"d15cd369-c62b-4e74-b181-a9760bf3213c\" (UID: \"d15cd369-c62b-4e74-b181-a9760bf3213c\") " Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.069437 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d15cd369-c62b-4e74-b181-a9760bf3213c-catalog-content\") pod \"d15cd369-c62b-4e74-b181-a9760bf3213c\" (UID: \"d15cd369-c62b-4e74-b181-a9760bf3213c\") " Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.070030 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d15cd369-c62b-4e74-b181-a9760bf3213c-utilities" (OuterVolumeSpecName: "utilities") pod "d15cd369-c62b-4e74-b181-a9760bf3213c" (UID: "d15cd369-c62b-4e74-b181-a9760bf3213c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.075452 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d15cd369-c62b-4e74-b181-a9760bf3213c-kube-api-access-rcjpp" (OuterVolumeSpecName: "kube-api-access-rcjpp") pod "d15cd369-c62b-4e74-b181-a9760bf3213c" (UID: "d15cd369-c62b-4e74-b181-a9760bf3213c"). InnerVolumeSpecName "kube-api-access-rcjpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.111875 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d15cd369-c62b-4e74-b181-a9760bf3213c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d15cd369-c62b-4e74-b181-a9760bf3213c" (UID: "d15cd369-c62b-4e74-b181-a9760bf3213c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.171437 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcjpp\" (UniqueName: \"kubernetes.io/projected/d15cd369-c62b-4e74-b181-a9760bf3213c-kube-api-access-rcjpp\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.171472 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d15cd369-c62b-4e74-b181-a9760bf3213c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.171484 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d15cd369-c62b-4e74-b181-a9760bf3213c-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.474629 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.474690 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.519010 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.815244 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-476mg" Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.849008 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-476mg"] Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.855245 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-476mg"] Oct 06 08:49:10 crc kubenswrapper[4755]: I1006 08:49:10.868898 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:11 crc kubenswrapper[4755]: I1006 08:49:11.031213 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-rxxsl"] Oct 06 08:49:11 crc kubenswrapper[4755]: I1006 08:49:11.040156 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-rxxsl"] Oct 06 08:49:11 crc kubenswrapper[4755]: I1006 08:49:11.890933 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2d1c5df-48c1-4df7-9b04-c19e9510168a" path="/var/lib/kubelet/pods/b2d1c5df-48c1-4df7-9b04-c19e9510168a/volumes" Oct 06 08:49:11 crc kubenswrapper[4755]: I1006 08:49:11.891729 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d15cd369-c62b-4e74-b181-a9760bf3213c" path="/var/lib/kubelet/pods/d15cd369-c62b-4e74-b181-a9760bf3213c/volumes" Oct 06 08:49:12 crc kubenswrapper[4755]: I1006 08:49:12.944437 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cb9pf"] Oct 06 08:49:12 crc kubenswrapper[4755]: I1006 08:49:12.944961 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cb9pf" podUID="b17fe854-2e09-42fe-a11a-ee5c6c575514" containerName="registry-server" containerID="cri-o://302b41b22e6050526eb912a9dc29e54b4f4803da4c48a7e6b451822cd1bda15a" gracePeriod=2 Oct 06 08:49:13 crc 
kubenswrapper[4755]: I1006 08:49:13.398724 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.533302 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b17fe854-2e09-42fe-a11a-ee5c6c575514-catalog-content\") pod \"b17fe854-2e09-42fe-a11a-ee5c6c575514\" (UID: \"b17fe854-2e09-42fe-a11a-ee5c6c575514\") " Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.533473 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b17fe854-2e09-42fe-a11a-ee5c6c575514-utilities\") pod \"b17fe854-2e09-42fe-a11a-ee5c6c575514\" (UID: \"b17fe854-2e09-42fe-a11a-ee5c6c575514\") " Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.533614 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8mxw\" (UniqueName: \"kubernetes.io/projected/b17fe854-2e09-42fe-a11a-ee5c6c575514-kube-api-access-m8mxw\") pod \"b17fe854-2e09-42fe-a11a-ee5c6c575514\" (UID: \"b17fe854-2e09-42fe-a11a-ee5c6c575514\") " Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.534427 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b17fe854-2e09-42fe-a11a-ee5c6c575514-utilities" (OuterVolumeSpecName: "utilities") pod "b17fe854-2e09-42fe-a11a-ee5c6c575514" (UID: "b17fe854-2e09-42fe-a11a-ee5c6c575514"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.539623 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b17fe854-2e09-42fe-a11a-ee5c6c575514-kube-api-access-m8mxw" (OuterVolumeSpecName: "kube-api-access-m8mxw") pod "b17fe854-2e09-42fe-a11a-ee5c6c575514" (UID: "b17fe854-2e09-42fe-a11a-ee5c6c575514"). InnerVolumeSpecName "kube-api-access-m8mxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.611857 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b17fe854-2e09-42fe-a11a-ee5c6c575514-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b17fe854-2e09-42fe-a11a-ee5c6c575514" (UID: "b17fe854-2e09-42fe-a11a-ee5c6c575514"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.635451 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b17fe854-2e09-42fe-a11a-ee5c6c575514-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.635481 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b17fe854-2e09-42fe-a11a-ee5c6c575514-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.635494 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8mxw\" (UniqueName: \"kubernetes.io/projected/b17fe854-2e09-42fe-a11a-ee5c6c575514-kube-api-access-m8mxw\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.842791 4755 generic.go:334] "Generic (PLEG): container finished" podID="b17fe854-2e09-42fe-a11a-ee5c6c575514" 
containerID="302b41b22e6050526eb912a9dc29e54b4f4803da4c48a7e6b451822cd1bda15a" exitCode=0 Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.842858 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cb9pf" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.842853 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb9pf" event={"ID":"b17fe854-2e09-42fe-a11a-ee5c6c575514","Type":"ContainerDied","Data":"302b41b22e6050526eb912a9dc29e54b4f4803da4c48a7e6b451822cd1bda15a"} Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.843058 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb9pf" event={"ID":"b17fe854-2e09-42fe-a11a-ee5c6c575514","Type":"ContainerDied","Data":"2563b90e58c8451fc229b367c9129973caa29e092d87a2666cab142112ab79f5"} Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.843079 4755 scope.go:117] "RemoveContainer" containerID="302b41b22e6050526eb912a9dc29e54b4f4803da4c48a7e6b451822cd1bda15a" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.883027 4755 scope.go:117] "RemoveContainer" containerID="100389756d64af0c70c14dfb434907ed44280b0b0ce5a86144e828e0081d0630" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.913685 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cb9pf"] Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.920267 4755 scope.go:117] "RemoveContainer" containerID="584f2df1a5b6744d76917d985e42bae9d68d206a78184ad7841be0a1111b9df5" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.923763 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cb9pf"] Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.949372 4755 scope.go:117] "RemoveContainer" containerID="302b41b22e6050526eb912a9dc29e54b4f4803da4c48a7e6b451822cd1bda15a" Oct 06 08:49:13 crc 
kubenswrapper[4755]: E1006 08:49:13.950426 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"302b41b22e6050526eb912a9dc29e54b4f4803da4c48a7e6b451822cd1bda15a\": container with ID starting with 302b41b22e6050526eb912a9dc29e54b4f4803da4c48a7e6b451822cd1bda15a not found: ID does not exist" containerID="302b41b22e6050526eb912a9dc29e54b4f4803da4c48a7e6b451822cd1bda15a" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.950467 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"302b41b22e6050526eb912a9dc29e54b4f4803da4c48a7e6b451822cd1bda15a"} err="failed to get container status \"302b41b22e6050526eb912a9dc29e54b4f4803da4c48a7e6b451822cd1bda15a\": rpc error: code = NotFound desc = could not find container \"302b41b22e6050526eb912a9dc29e54b4f4803da4c48a7e6b451822cd1bda15a\": container with ID starting with 302b41b22e6050526eb912a9dc29e54b4f4803da4c48a7e6b451822cd1bda15a not found: ID does not exist" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.950492 4755 scope.go:117] "RemoveContainer" containerID="100389756d64af0c70c14dfb434907ed44280b0b0ce5a86144e828e0081d0630" Oct 06 08:49:13 crc kubenswrapper[4755]: E1006 08:49:13.951216 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"100389756d64af0c70c14dfb434907ed44280b0b0ce5a86144e828e0081d0630\": container with ID starting with 100389756d64af0c70c14dfb434907ed44280b0b0ce5a86144e828e0081d0630 not found: ID does not exist" containerID="100389756d64af0c70c14dfb434907ed44280b0b0ce5a86144e828e0081d0630" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.951263 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"100389756d64af0c70c14dfb434907ed44280b0b0ce5a86144e828e0081d0630"} err="failed to get container status 
\"100389756d64af0c70c14dfb434907ed44280b0b0ce5a86144e828e0081d0630\": rpc error: code = NotFound desc = could not find container \"100389756d64af0c70c14dfb434907ed44280b0b0ce5a86144e828e0081d0630\": container with ID starting with 100389756d64af0c70c14dfb434907ed44280b0b0ce5a86144e828e0081d0630 not found: ID does not exist" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.951294 4755 scope.go:117] "RemoveContainer" containerID="584f2df1a5b6744d76917d985e42bae9d68d206a78184ad7841be0a1111b9df5" Oct 06 08:49:13 crc kubenswrapper[4755]: E1006 08:49:13.952008 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"584f2df1a5b6744d76917d985e42bae9d68d206a78184ad7841be0a1111b9df5\": container with ID starting with 584f2df1a5b6744d76917d985e42bae9d68d206a78184ad7841be0a1111b9df5 not found: ID does not exist" containerID="584f2df1a5b6744d76917d985e42bae9d68d206a78184ad7841be0a1111b9df5" Oct 06 08:49:13 crc kubenswrapper[4755]: I1006 08:49:13.952038 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"584f2df1a5b6744d76917d985e42bae9d68d206a78184ad7841be0a1111b9df5"} err="failed to get container status \"584f2df1a5b6744d76917d985e42bae9d68d206a78184ad7841be0a1111b9df5\": rpc error: code = NotFound desc = could not find container \"584f2df1a5b6744d76917d985e42bae9d68d206a78184ad7841be0a1111b9df5\": container with ID starting with 584f2df1a5b6744d76917d985e42bae9d68d206a78184ad7841be0a1111b9df5 not found: ID does not exist" Oct 06 08:49:15 crc kubenswrapper[4755]: I1006 08:49:15.891083 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b17fe854-2e09-42fe-a11a-ee5c6c575514" path="/var/lib/kubelet/pods/b17fe854-2e09-42fe-a11a-ee5c6c575514/volumes" Oct 06 08:49:17 crc kubenswrapper[4755]: I1006 08:49:17.879549 4755 generic.go:334] "Generic (PLEG): container finished" podID="30def537-c2c5-4042-bed2-29c3a6f6bc57" 
containerID="90f9a6564fc9de9e5c5f2394cd1d105cb2e18666540b5631cfe1890c5dd09252" exitCode=2 Oct 06 08:49:17 crc kubenswrapper[4755]: I1006 08:49:17.888013 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" event={"ID":"30def537-c2c5-4042-bed2-29c3a6f6bc57","Type":"ContainerDied","Data":"90f9a6564fc9de9e5c5f2394cd1d105cb2e18666540b5631cfe1890c5dd09252"} Oct 06 08:49:18 crc kubenswrapper[4755]: I1006 08:49:18.912437 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:49:18 crc kubenswrapper[4755]: I1006 08:49:18.912818 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:49:19 crc kubenswrapper[4755]: I1006 08:49:19.255055 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" Oct 06 08:49:19 crc kubenswrapper[4755]: I1006 08:49:19.343796 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcsjm\" (UniqueName: \"kubernetes.io/projected/30def537-c2c5-4042-bed2-29c3a6f6bc57-kube-api-access-zcsjm\") pod \"30def537-c2c5-4042-bed2-29c3a6f6bc57\" (UID: \"30def537-c2c5-4042-bed2-29c3a6f6bc57\") " Oct 06 08:49:19 crc kubenswrapper[4755]: I1006 08:49:19.343999 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30def537-c2c5-4042-bed2-29c3a6f6bc57-inventory\") pod \"30def537-c2c5-4042-bed2-29c3a6f6bc57\" (UID: \"30def537-c2c5-4042-bed2-29c3a6f6bc57\") " Oct 06 08:49:19 crc kubenswrapper[4755]: I1006 08:49:19.344087 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30def537-c2c5-4042-bed2-29c3a6f6bc57-ssh-key\") pod \"30def537-c2c5-4042-bed2-29c3a6f6bc57\" (UID: \"30def537-c2c5-4042-bed2-29c3a6f6bc57\") " Oct 06 08:49:19 crc kubenswrapper[4755]: I1006 08:49:19.350752 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30def537-c2c5-4042-bed2-29c3a6f6bc57-kube-api-access-zcsjm" (OuterVolumeSpecName: "kube-api-access-zcsjm") pod "30def537-c2c5-4042-bed2-29c3a6f6bc57" (UID: "30def537-c2c5-4042-bed2-29c3a6f6bc57"). InnerVolumeSpecName "kube-api-access-zcsjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:49:19 crc kubenswrapper[4755]: I1006 08:49:19.372206 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30def537-c2c5-4042-bed2-29c3a6f6bc57-inventory" (OuterVolumeSpecName: "inventory") pod "30def537-c2c5-4042-bed2-29c3a6f6bc57" (UID: "30def537-c2c5-4042-bed2-29c3a6f6bc57"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:49:19 crc kubenswrapper[4755]: I1006 08:49:19.373537 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30def537-c2c5-4042-bed2-29c3a6f6bc57-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "30def537-c2c5-4042-bed2-29c3a6f6bc57" (UID: "30def537-c2c5-4042-bed2-29c3a6f6bc57"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:49:19 crc kubenswrapper[4755]: I1006 08:49:19.445640 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30def537-c2c5-4042-bed2-29c3a6f6bc57-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:19 crc kubenswrapper[4755]: I1006 08:49:19.445671 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcsjm\" (UniqueName: \"kubernetes.io/projected/30def537-c2c5-4042-bed2-29c3a6f6bc57-kube-api-access-zcsjm\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:19 crc kubenswrapper[4755]: I1006 08:49:19.445682 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30def537-c2c5-4042-bed2-29c3a6f6bc57-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:19 crc kubenswrapper[4755]: I1006 08:49:19.895595 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" event={"ID":"30def537-c2c5-4042-bed2-29c3a6f6bc57","Type":"ContainerDied","Data":"d37930aacb21698f4746e95dbc224b99e09e73c72f69071f5137b866d740e74e"} Oct 06 08:49:19 crc kubenswrapper[4755]: I1006 08:49:19.895640 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d37930aacb21698f4746e95dbc224b99e09e73c72f69071f5137b866d740e74e" Oct 06 08:49:19 crc kubenswrapper[4755]: I1006 08:49:19.895669 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7" Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.035299 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"] Oct 06 08:49:26 crc kubenswrapper[4755]: E1006 08:49:26.036881 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b17fe854-2e09-42fe-a11a-ee5c6c575514" containerName="extract-utilities" Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.036916 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b17fe854-2e09-42fe-a11a-ee5c6c575514" containerName="extract-utilities" Oct 06 08:49:26 crc kubenswrapper[4755]: E1006 08:49:26.036952 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30def537-c2c5-4042-bed2-29c3a6f6bc57" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.036976 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="30def537-c2c5-4042-bed2-29c3a6f6bc57" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:49:26 crc kubenswrapper[4755]: E1006 08:49:26.037014 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15cd369-c62b-4e74-b181-a9760bf3213c" containerName="extract-content" Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.037032 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15cd369-c62b-4e74-b181-a9760bf3213c" containerName="extract-content" Oct 06 08:49:26 crc kubenswrapper[4755]: E1006 08:49:26.037062 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15cd369-c62b-4e74-b181-a9760bf3213c" containerName="registry-server" Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.037080 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15cd369-c62b-4e74-b181-a9760bf3213c" containerName="registry-server" Oct 06 08:49:26 crc kubenswrapper[4755]: E1006 
08:49:26.037102 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b17fe854-2e09-42fe-a11a-ee5c6c575514" containerName="extract-content"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.037119 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b17fe854-2e09-42fe-a11a-ee5c6c575514" containerName="extract-content"
Oct 06 08:49:26 crc kubenswrapper[4755]: E1006 08:49:26.037152 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15cd369-c62b-4e74-b181-a9760bf3213c" containerName="extract-utilities"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.037169 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15cd369-c62b-4e74-b181-a9760bf3213c" containerName="extract-utilities"
Oct 06 08:49:26 crc kubenswrapper[4755]: E1006 08:49:26.037213 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b17fe854-2e09-42fe-a11a-ee5c6c575514" containerName="registry-server"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.037230 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b17fe854-2e09-42fe-a11a-ee5c6c575514" containerName="registry-server"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.037729 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="30def537-c2c5-4042-bed2-29c3a6f6bc57" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.037777 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15cd369-c62b-4e74-b181-a9760bf3213c" containerName="registry-server"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.037796 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b17fe854-2e09-42fe-a11a-ee5c6c575514" containerName="registry-server"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.039064 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.041719 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.044142 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.044800 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.045071 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.045379 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"]
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.152807 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdnx5\" (UniqueName: \"kubernetes.io/projected/0a0700ef-3f60-458c-a388-927d24dfaed2-kube-api-access-pdnx5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp\" (UID: \"0a0700ef-3f60-458c-a388-927d24dfaed2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.152892 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a0700ef-3f60-458c-a388-927d24dfaed2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp\" (UID: \"0a0700ef-3f60-458c-a388-927d24dfaed2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.152957 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a0700ef-3f60-458c-a388-927d24dfaed2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp\" (UID: \"0a0700ef-3f60-458c-a388-927d24dfaed2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.254585 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a0700ef-3f60-458c-a388-927d24dfaed2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp\" (UID: \"0a0700ef-3f60-458c-a388-927d24dfaed2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.254776 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdnx5\" (UniqueName: \"kubernetes.io/projected/0a0700ef-3f60-458c-a388-927d24dfaed2-kube-api-access-pdnx5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp\" (UID: \"0a0700ef-3f60-458c-a388-927d24dfaed2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.254838 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a0700ef-3f60-458c-a388-927d24dfaed2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp\" (UID: \"0a0700ef-3f60-458c-a388-927d24dfaed2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.261531 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a0700ef-3f60-458c-a388-927d24dfaed2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp\" (UID: \"0a0700ef-3f60-458c-a388-927d24dfaed2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.262126 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a0700ef-3f60-458c-a388-927d24dfaed2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp\" (UID: \"0a0700ef-3f60-458c-a388-927d24dfaed2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.276541 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdnx5\" (UniqueName: \"kubernetes.io/projected/0a0700ef-3f60-458c-a388-927d24dfaed2-kube-api-access-pdnx5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp\" (UID: \"0a0700ef-3f60-458c-a388-927d24dfaed2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.372039 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.881317 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"]
Oct 06 08:49:26 crc kubenswrapper[4755]: I1006 08:49:26.975079 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp" event={"ID":"0a0700ef-3f60-458c-a388-927d24dfaed2","Type":"ContainerStarted","Data":"94b1c5c63a3503618ad5b408acb2ba42093949782a1600dbac828c9eb566703d"}
Oct 06 08:49:27 crc kubenswrapper[4755]: I1006 08:49:27.039060 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-dsb4x"]
Oct 06 08:49:27 crc kubenswrapper[4755]: I1006 08:49:27.049667 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-dsb4x"]
Oct 06 08:49:27 crc kubenswrapper[4755]: I1006 08:49:27.889736 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c" path="/var/lib/kubelet/pods/8c6f5eb2-4ba0-4d5c-badd-a0ddb2da6f5c/volumes"
Oct 06 08:49:27 crc kubenswrapper[4755]: I1006 08:49:27.986149 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp" event={"ID":"0a0700ef-3f60-458c-a388-927d24dfaed2","Type":"ContainerStarted","Data":"f520a9d9452bd920b5143f99eb59204b102921ed39d7806c079293ef55561a18"}
Oct 06 08:49:28 crc kubenswrapper[4755]: I1006 08:49:28.006866 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp" podStartSLOduration=1.436737272 podStartE2EDuration="2.006833181s" podCreationTimestamp="2025-10-06 08:49:26 +0000 UTC" firstStartedPulling="2025-10-06 08:49:26.893824069 +0000 UTC m=+1623.723139283" lastFinishedPulling="2025-10-06 08:49:27.463919978 +0000 UTC m=+1624.293235192" observedRunningTime="2025-10-06 08:49:28.000556804 +0000 UTC m=+1624.829872018" watchObservedRunningTime="2025-10-06 08:49:28.006833181 +0000 UTC m=+1624.836148395"
Oct 06 08:49:28 crc kubenswrapper[4755]: I1006 08:49:28.031553 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-c6wg6"]
Oct 06 08:49:28 crc kubenswrapper[4755]: I1006 08:49:28.038806 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-c6wg6"]
Oct 06 08:49:29 crc kubenswrapper[4755]: I1006 08:49:29.887987 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9755bfc9-d53e-4848-8d4b-04fdef46a4ea" path="/var/lib/kubelet/pods/9755bfc9-d53e-4848-8d4b-04fdef46a4ea/volumes"
Oct 06 08:49:39 crc kubenswrapper[4755]: I1006 08:49:39.365006 4755 scope.go:117] "RemoveContainer" containerID="d9a7deb04f821c6bd0ff5f1901592f8f21a22dcd3b6b6bbbfedd54dc74916100"
Oct 06 08:49:39 crc kubenswrapper[4755]: I1006 08:49:39.408125 4755 scope.go:117] "RemoveContainer" containerID="ef3094cd8d71e856b999d3329d9b79ae34f3cc0b0ce5d51134fb7b1e3e422508"
Oct 06 08:49:39 crc kubenswrapper[4755]: I1006 08:49:39.440841 4755 scope.go:117] "RemoveContainer" containerID="5fd8320b76815c6a615e0b5aec5fd5060e9162980330e15da428dbf614ae81f5"
Oct 06 08:49:39 crc kubenswrapper[4755]: I1006 08:49:39.493210 4755 scope.go:117] "RemoveContainer" containerID="1fbc26523512b85dd55613bbac44bccfff876e7d095b758502899c48aed5d694"
Oct 06 08:49:39 crc kubenswrapper[4755]: I1006 08:49:39.513348 4755 scope.go:117] "RemoveContainer" containerID="94c8ad26391b632e5f7b578218c32b8a57b58c1653ba4ed7c4c638eef2b92a24"
Oct 06 08:49:39 crc kubenswrapper[4755]: I1006 08:49:39.585282 4755 scope.go:117] "RemoveContainer" containerID="7a7cce3628c1484c9173dd39feb5725d6bd6cf8b44a4216de400932d6e57b963"
Oct 06 08:49:39 crc kubenswrapper[4755]: I1006 08:49:39.607179 4755 scope.go:117] "RemoveContainer" containerID="d93a136dc69d666c5a1b24a5ce09b5f163fa7c3a458a70322d40a2a2ece5d440"
Oct 06 08:49:39 crc kubenswrapper[4755]: I1006 08:49:39.650019 4755 scope.go:117] "RemoveContainer" containerID="c6c16b1460f709c0c887bf57cf3edf06a0fb001a6edc33d3179ca7308720e9e2"
Oct 06 08:49:48 crc kubenswrapper[4755]: I1006 08:49:48.912077 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 08:49:48 crc kubenswrapper[4755]: I1006 08:49:48.912608 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 08:49:53 crc kubenswrapper[4755]: I1006 08:49:53.041085 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-t749f"]
Oct 06 08:49:53 crc kubenswrapper[4755]: I1006 08:49:53.050697 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-p85gv"]
Oct 06 08:49:53 crc kubenswrapper[4755]: I1006 08:49:53.063311 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9hrkm"]
Oct 06 08:49:53 crc kubenswrapper[4755]: I1006 08:49:53.072269 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-t749f"]
Oct 06 08:49:53 crc kubenswrapper[4755]: I1006 08:49:53.081165 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-p85gv"]
Oct 06 08:49:53 crc kubenswrapper[4755]: I1006 08:49:53.088847 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9hrkm"]
Oct 06 08:49:53 crc kubenswrapper[4755]: I1006 08:49:53.888663 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="092ec804-1b49-4994-94f3-2051535bb3bf" path="/var/lib/kubelet/pods/092ec804-1b49-4994-94f3-2051535bb3bf/volumes"
Oct 06 08:49:53 crc kubenswrapper[4755]: I1006 08:49:53.889504 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20cf827c-cb1d-42b1-a5e0-63854c591bdf" path="/var/lib/kubelet/pods/20cf827c-cb1d-42b1-a5e0-63854c591bdf/volumes"
Oct 06 08:49:53 crc kubenswrapper[4755]: I1006 08:49:53.890023 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22360df0-a5e3-45dd-95b7-ddec07373964" path="/var/lib/kubelet/pods/22360df0-a5e3-45dd-95b7-ddec07373964/volumes"
Oct 06 08:50:08 crc kubenswrapper[4755]: I1006 08:50:08.028791 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1f63-account-create-59mh2"]
Oct 06 08:50:08 crc kubenswrapper[4755]: I1006 08:50:08.038015 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-a0db-account-create-z75g9"]
Oct 06 08:50:08 crc kubenswrapper[4755]: I1006 08:50:08.045933 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1f63-account-create-59mh2"]
Oct 06 08:50:08 crc kubenswrapper[4755]: I1006 08:50:08.053105 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-a0db-account-create-z75g9"]
Oct 06 08:50:08 crc kubenswrapper[4755]: I1006 08:50:08.394273 4755 generic.go:334] "Generic (PLEG): container finished" podID="0a0700ef-3f60-458c-a388-927d24dfaed2" containerID="f520a9d9452bd920b5143f99eb59204b102921ed39d7806c079293ef55561a18" exitCode=0
Oct 06 08:50:08 crc kubenswrapper[4755]: I1006 08:50:08.394330 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp" event={"ID":"0a0700ef-3f60-458c-a388-927d24dfaed2","Type":"ContainerDied","Data":"f520a9d9452bd920b5143f99eb59204b102921ed39d7806c079293ef55561a18"}
Oct 06 08:50:09 crc kubenswrapper[4755]: I1006 08:50:09.030247 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-d6ff-account-create-zx2lm"]
Oct 06 08:50:09 crc kubenswrapper[4755]: I1006 08:50:09.036497 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-d6ff-account-create-zx2lm"]
Oct 06 08:50:09 crc kubenswrapper[4755]: I1006 08:50:09.798927 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"
Oct 06 08:50:09 crc kubenswrapper[4755]: I1006 08:50:09.889281 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74cae32a-c0bb-4798-9466-198d0da08a4c" path="/var/lib/kubelet/pods/74cae32a-c0bb-4798-9466-198d0da08a4c/volumes"
Oct 06 08:50:09 crc kubenswrapper[4755]: I1006 08:50:09.889901 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab35ed23-84a2-4096-abf1-43a71d39e29b" path="/var/lib/kubelet/pods/ab35ed23-84a2-4096-abf1-43a71d39e29b/volumes"
Oct 06 08:50:09 crc kubenswrapper[4755]: I1006 08:50:09.890466 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d20dd398-8259-4013-b75a-ef645050819e" path="/var/lib/kubelet/pods/d20dd398-8259-4013-b75a-ef645050819e/volumes"
Oct 06 08:50:09 crc kubenswrapper[4755]: I1006 08:50:09.958660 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a0700ef-3f60-458c-a388-927d24dfaed2-ssh-key\") pod \"0a0700ef-3f60-458c-a388-927d24dfaed2\" (UID: \"0a0700ef-3f60-458c-a388-927d24dfaed2\") "
Oct 06 08:50:09 crc kubenswrapper[4755]: I1006 08:50:09.958766 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdnx5\" (UniqueName: \"kubernetes.io/projected/0a0700ef-3f60-458c-a388-927d24dfaed2-kube-api-access-pdnx5\") pod \"0a0700ef-3f60-458c-a388-927d24dfaed2\" (UID: \"0a0700ef-3f60-458c-a388-927d24dfaed2\") "
Oct 06 08:50:09 crc kubenswrapper[4755]: I1006 08:50:09.958929 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a0700ef-3f60-458c-a388-927d24dfaed2-inventory\") pod \"0a0700ef-3f60-458c-a388-927d24dfaed2\" (UID: \"0a0700ef-3f60-458c-a388-927d24dfaed2\") "
Oct 06 08:50:09 crc kubenswrapper[4755]: I1006 08:50:09.964599 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0700ef-3f60-458c-a388-927d24dfaed2-kube-api-access-pdnx5" (OuterVolumeSpecName: "kube-api-access-pdnx5") pod "0a0700ef-3f60-458c-a388-927d24dfaed2" (UID: "0a0700ef-3f60-458c-a388-927d24dfaed2"). InnerVolumeSpecName "kube-api-access-pdnx5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:50:09 crc kubenswrapper[4755]: I1006 08:50:09.988836 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0700ef-3f60-458c-a388-927d24dfaed2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0a0700ef-3f60-458c-a388-927d24dfaed2" (UID: "0a0700ef-3f60-458c-a388-927d24dfaed2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:50:09 crc kubenswrapper[4755]: I1006 08:50:09.989355 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0700ef-3f60-458c-a388-927d24dfaed2-inventory" (OuterVolumeSpecName: "inventory") pod "0a0700ef-3f60-458c-a388-927d24dfaed2" (UID: "0a0700ef-3f60-458c-a388-927d24dfaed2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.062115 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a0700ef-3f60-458c-a388-927d24dfaed2-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.062157 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdnx5\" (UniqueName: \"kubernetes.io/projected/0a0700ef-3f60-458c-a388-927d24dfaed2-kube-api-access-pdnx5\") on node \"crc\" DevicePath \"\""
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.062174 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a0700ef-3f60-458c-a388-927d24dfaed2-inventory\") on node \"crc\" DevicePath \"\""
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.417036 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp" event={"ID":"0a0700ef-3f60-458c-a388-927d24dfaed2","Type":"ContainerDied","Data":"94b1c5c63a3503618ad5b408acb2ba42093949782a1600dbac828c9eb566703d"}
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.417697 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94b1c5c63a3503618ad5b408acb2ba42093949782a1600dbac828c9eb566703d"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.417087 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.492074 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bchc9"]
Oct 06 08:50:10 crc kubenswrapper[4755]: E1006 08:50:10.492639 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0700ef-3f60-458c-a388-927d24dfaed2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.492706 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0700ef-3f60-458c-a388-927d24dfaed2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.492950 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a0700ef-3f60-458c-a388-927d24dfaed2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.493621 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bchc9"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.495902 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.496340 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.497752 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.500493 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.504568 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bchc9"]
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.672790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc97d\" (UniqueName: \"kubernetes.io/projected/436cff83-cfd5-431a-bd35-519543c0a74c-kube-api-access-vc97d\") pod \"ssh-known-hosts-edpm-deployment-bchc9\" (UID: \"436cff83-cfd5-431a-bd35-519543c0a74c\") " pod="openstack/ssh-known-hosts-edpm-deployment-bchc9"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.673143 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/436cff83-cfd5-431a-bd35-519543c0a74c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-bchc9\" (UID: \"436cff83-cfd5-431a-bd35-519543c0a74c\") " pod="openstack/ssh-known-hosts-edpm-deployment-bchc9"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.673313 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/436cff83-cfd5-431a-bd35-519543c0a74c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-bchc9\" (UID: \"436cff83-cfd5-431a-bd35-519543c0a74c\") " pod="openstack/ssh-known-hosts-edpm-deployment-bchc9"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.775460 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc97d\" (UniqueName: \"kubernetes.io/projected/436cff83-cfd5-431a-bd35-519543c0a74c-kube-api-access-vc97d\") pod \"ssh-known-hosts-edpm-deployment-bchc9\" (UID: \"436cff83-cfd5-431a-bd35-519543c0a74c\") " pod="openstack/ssh-known-hosts-edpm-deployment-bchc9"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.775561 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/436cff83-cfd5-431a-bd35-519543c0a74c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-bchc9\" (UID: \"436cff83-cfd5-431a-bd35-519543c0a74c\") " pod="openstack/ssh-known-hosts-edpm-deployment-bchc9"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.775655 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/436cff83-cfd5-431a-bd35-519543c0a74c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-bchc9\" (UID: \"436cff83-cfd5-431a-bd35-519543c0a74c\") " pod="openstack/ssh-known-hosts-edpm-deployment-bchc9"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.779539 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/436cff83-cfd5-431a-bd35-519543c0a74c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-bchc9\" (UID: \"436cff83-cfd5-431a-bd35-519543c0a74c\") " pod="openstack/ssh-known-hosts-edpm-deployment-bchc9"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.779982 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/436cff83-cfd5-431a-bd35-519543c0a74c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-bchc9\" (UID: \"436cff83-cfd5-431a-bd35-519543c0a74c\") " pod="openstack/ssh-known-hosts-edpm-deployment-bchc9"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.798720 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc97d\" (UniqueName: \"kubernetes.io/projected/436cff83-cfd5-431a-bd35-519543c0a74c-kube-api-access-vc97d\") pod \"ssh-known-hosts-edpm-deployment-bchc9\" (UID: \"436cff83-cfd5-431a-bd35-519543c0a74c\") " pod="openstack/ssh-known-hosts-edpm-deployment-bchc9"
Oct 06 08:50:10 crc kubenswrapper[4755]: I1006 08:50:10.809370 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bchc9"
Oct 06 08:50:11 crc kubenswrapper[4755]: I1006 08:50:11.318819 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bchc9"]
Oct 06 08:50:11 crc kubenswrapper[4755]: I1006 08:50:11.425862 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bchc9" event={"ID":"436cff83-cfd5-431a-bd35-519543c0a74c","Type":"ContainerStarted","Data":"0b19f55334027c3c74a4a315770e421f36a5fe9849c117878efc847690bb8cd7"}
Oct 06 08:50:12 crc kubenswrapper[4755]: I1006 08:50:12.440061 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bchc9" event={"ID":"436cff83-cfd5-431a-bd35-519543c0a74c","Type":"ContainerStarted","Data":"829f151605817342884f44c5890146f5e6640b9a2b097f97b8838073ff75abbd"}
Oct 06 08:50:12 crc kubenswrapper[4755]: I1006 08:50:12.455077 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-bchc9" podStartSLOduration=1.8921392689999998 podStartE2EDuration="2.455051766s" podCreationTimestamp="2025-10-06 08:50:10 +0000 UTC" firstStartedPulling="2025-10-06 08:50:11.333591112 +0000 UTC m=+1668.162906336" lastFinishedPulling="2025-10-06 08:50:11.896503619 +0000 UTC m=+1668.725818833" observedRunningTime="2025-10-06 08:50:12.453858746 +0000 UTC m=+1669.283173980" watchObservedRunningTime="2025-10-06 08:50:12.455051766 +0000 UTC m=+1669.284366980"
Oct 06 08:50:18 crc kubenswrapper[4755]: I1006 08:50:18.912236 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 08:50:18 crc kubenswrapper[4755]: I1006 08:50:18.912708 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 08:50:18 crc kubenswrapper[4755]: I1006 08:50:18.912754 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq"
Oct 06 08:50:18 crc kubenswrapper[4755]: I1006 08:50:18.913466 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 08:50:18 crc kubenswrapper[4755]: I1006 08:50:18.913520 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" gracePeriod=600
Oct 06 08:50:19 crc kubenswrapper[4755]: E1006 08:50:19.044918 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 08:50:19 crc kubenswrapper[4755]: I1006 08:50:19.497437 4755 generic.go:334] "Generic (PLEG): container finished" podID="436cff83-cfd5-431a-bd35-519543c0a74c" containerID="829f151605817342884f44c5890146f5e6640b9a2b097f97b8838073ff75abbd" exitCode=0
Oct 06 08:50:19 crc kubenswrapper[4755]: I1006 08:50:19.497553 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bchc9" event={"ID":"436cff83-cfd5-431a-bd35-519543c0a74c","Type":"ContainerDied","Data":"829f151605817342884f44c5890146f5e6640b9a2b097f97b8838073ff75abbd"}
Oct 06 08:50:19 crc kubenswrapper[4755]: I1006 08:50:19.500375 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" exitCode=0
Oct 06 08:50:19 crc kubenswrapper[4755]: I1006 08:50:19.500414 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73"}
Oct 06 08:50:19 crc kubenswrapper[4755]: I1006 08:50:19.500454 4755 scope.go:117] "RemoveContainer" containerID="37b01df043f3f9837ed355e230bc753aeb1a969fd4ba5cafcaadc04ae46cebd9"
Oct 06 08:50:19 crc kubenswrapper[4755]: I1006 08:50:19.501818 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73"
Oct 06 08:50:19 crc kubenswrapper[4755]: E1006 08:50:19.502263 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 08:50:20 crc kubenswrapper[4755]: I1006 08:50:20.934654 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bchc9"
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.061556 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc97d\" (UniqueName: \"kubernetes.io/projected/436cff83-cfd5-431a-bd35-519543c0a74c-kube-api-access-vc97d\") pod \"436cff83-cfd5-431a-bd35-519543c0a74c\" (UID: \"436cff83-cfd5-431a-bd35-519543c0a74c\") "
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.061649 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/436cff83-cfd5-431a-bd35-519543c0a74c-ssh-key-openstack-edpm-ipam\") pod \"436cff83-cfd5-431a-bd35-519543c0a74c\" (UID: \"436cff83-cfd5-431a-bd35-519543c0a74c\") "
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.061738 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/436cff83-cfd5-431a-bd35-519543c0a74c-inventory-0\") pod \"436cff83-cfd5-431a-bd35-519543c0a74c\" (UID: \"436cff83-cfd5-431a-bd35-519543c0a74c\") "
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.068507 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/436cff83-cfd5-431a-bd35-519543c0a74c-kube-api-access-vc97d" (OuterVolumeSpecName: "kube-api-access-vc97d") pod "436cff83-cfd5-431a-bd35-519543c0a74c" (UID: "436cff83-cfd5-431a-bd35-519543c0a74c"). InnerVolumeSpecName "kube-api-access-vc97d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.089274 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/436cff83-cfd5-431a-bd35-519543c0a74c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "436cff83-cfd5-431a-bd35-519543c0a74c" (UID: "436cff83-cfd5-431a-bd35-519543c0a74c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.090692 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/436cff83-cfd5-431a-bd35-519543c0a74c-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "436cff83-cfd5-431a-bd35-519543c0a74c" (UID: "436cff83-cfd5-431a-bd35-519543c0a74c"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.164539 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc97d\" (UniqueName: \"kubernetes.io/projected/436cff83-cfd5-431a-bd35-519543c0a74c-kube-api-access-vc97d\") on node \"crc\" DevicePath \"\""
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.164585 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/436cff83-cfd5-431a-bd35-519543c0a74c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.164597 4755 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/436cff83-cfd5-431a-bd35-519543c0a74c-inventory-0\") on node \"crc\" DevicePath \"\""
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.520724 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bchc9" event={"ID":"436cff83-cfd5-431a-bd35-519543c0a74c","Type":"ContainerDied","Data":"0b19f55334027c3c74a4a315770e421f36a5fe9849c117878efc847690bb8cd7"}
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.520772 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b19f55334027c3c74a4a315770e421f36a5fe9849c117878efc847690bb8cd7"
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.520781 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bchc9"
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.582443 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl"]
Oct 06 08:50:21 crc kubenswrapper[4755]: E1006 08:50:21.582963 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="436cff83-cfd5-431a-bd35-519543c0a74c" containerName="ssh-known-hosts-edpm-deployment"
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.582982 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="436cff83-cfd5-431a-bd35-519543c0a74c" containerName="ssh-known-hosts-edpm-deployment"
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.583193 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="436cff83-cfd5-431a-bd35-519543c0a74c" containerName="ssh-known-hosts-edpm-deployment"
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.583937 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl"
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.585894 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.586098 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb"
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.587275 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.595640 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.599134 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl"]
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.675234 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e0b2022-2854-4995-9a75-9a64266ab5ed-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fcrl\" (UID: \"3e0b2022-2854-4995-9a75-9a64266ab5ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl"
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.675331 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e0b2022-2854-4995-9a75-9a64266ab5ed-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fcrl\" (UID: \"3e0b2022-2854-4995-9a75-9a64266ab5ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl"
Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.675384 4755 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2prlc\" (UniqueName: \"kubernetes.io/projected/3e0b2022-2854-4995-9a75-9a64266ab5ed-kube-api-access-2prlc\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fcrl\" (UID: \"3e0b2022-2854-4995-9a75-9a64266ab5ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.777104 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2prlc\" (UniqueName: \"kubernetes.io/projected/3e0b2022-2854-4995-9a75-9a64266ab5ed-kube-api-access-2prlc\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fcrl\" (UID: \"3e0b2022-2854-4995-9a75-9a64266ab5ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.777206 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e0b2022-2854-4995-9a75-9a64266ab5ed-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fcrl\" (UID: \"3e0b2022-2854-4995-9a75-9a64266ab5ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.777275 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e0b2022-2854-4995-9a75-9a64266ab5ed-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fcrl\" (UID: \"3e0b2022-2854-4995-9a75-9a64266ab5ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.780878 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e0b2022-2854-4995-9a75-9a64266ab5ed-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fcrl\" (UID: \"3e0b2022-2854-4995-9a75-9a64266ab5ed\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.781432 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e0b2022-2854-4995-9a75-9a64266ab5ed-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fcrl\" (UID: \"3e0b2022-2854-4995-9a75-9a64266ab5ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.798810 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2prlc\" (UniqueName: \"kubernetes.io/projected/3e0b2022-2854-4995-9a75-9a64266ab5ed-kube-api-access-2prlc\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fcrl\" (UID: \"3e0b2022-2854-4995-9a75-9a64266ab5ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" Oct 06 08:50:21 crc kubenswrapper[4755]: I1006 08:50:21.903349 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" Oct 06 08:50:22 crc kubenswrapper[4755]: I1006 08:50:22.435314 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl"] Oct 06 08:50:22 crc kubenswrapper[4755]: I1006 08:50:22.528590 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" event={"ID":"3e0b2022-2854-4995-9a75-9a64266ab5ed","Type":"ContainerStarted","Data":"9854b91034bf571cb63d72385cf59a3fe426331421282e373456ff83d1bd3c48"} Oct 06 08:50:23 crc kubenswrapper[4755]: I1006 08:50:23.537717 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" event={"ID":"3e0b2022-2854-4995-9a75-9a64266ab5ed","Type":"ContainerStarted","Data":"31b9dcca4ab87e25d626f44e7752eee4605690462bd5f0240edccdcf205e36e2"} Oct 06 08:50:23 crc kubenswrapper[4755]: I1006 08:50:23.560761 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" podStartSLOduration=2.087443936 podStartE2EDuration="2.560739271s" podCreationTimestamp="2025-10-06 08:50:21 +0000 UTC" firstStartedPulling="2025-10-06 08:50:22.447682558 +0000 UTC m=+1679.276997772" lastFinishedPulling="2025-10-06 08:50:22.920977883 +0000 UTC m=+1679.750293107" observedRunningTime="2025-10-06 08:50:23.551885098 +0000 UTC m=+1680.381200312" watchObservedRunningTime="2025-10-06 08:50:23.560739271 +0000 UTC m=+1680.390054485" Oct 06 08:50:30 crc kubenswrapper[4755]: I1006 08:50:30.038030 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lq2rs"] Oct 06 08:50:30 crc kubenswrapper[4755]: I1006 08:50:30.046143 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lq2rs"] Oct 06 08:50:31 crc kubenswrapper[4755]: I1006 08:50:31.598720 4755 
generic.go:334] "Generic (PLEG): container finished" podID="3e0b2022-2854-4995-9a75-9a64266ab5ed" containerID="31b9dcca4ab87e25d626f44e7752eee4605690462bd5f0240edccdcf205e36e2" exitCode=0 Oct 06 08:50:31 crc kubenswrapper[4755]: I1006 08:50:31.598773 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" event={"ID":"3e0b2022-2854-4995-9a75-9a64266ab5ed","Type":"ContainerDied","Data":"31b9dcca4ab87e25d626f44e7752eee4605690462bd5f0240edccdcf205e36e2"} Oct 06 08:50:31 crc kubenswrapper[4755]: I1006 08:50:31.890321 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1730006e-7d6b-47eb-a254-e04ba1f1a44e" path="/var/lib/kubelet/pods/1730006e-7d6b-47eb-a254-e04ba1f1a44e/volumes" Oct 06 08:50:32 crc kubenswrapper[4755]: I1006 08:50:32.880153 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:50:32 crc kubenswrapper[4755]: E1006 08:50:32.880831 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:50:32 crc kubenswrapper[4755]: I1006 08:50:32.979089 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.079495 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2prlc\" (UniqueName: \"kubernetes.io/projected/3e0b2022-2854-4995-9a75-9a64266ab5ed-kube-api-access-2prlc\") pod \"3e0b2022-2854-4995-9a75-9a64266ab5ed\" (UID: \"3e0b2022-2854-4995-9a75-9a64266ab5ed\") " Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.079621 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e0b2022-2854-4995-9a75-9a64266ab5ed-ssh-key\") pod \"3e0b2022-2854-4995-9a75-9a64266ab5ed\" (UID: \"3e0b2022-2854-4995-9a75-9a64266ab5ed\") " Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.079691 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e0b2022-2854-4995-9a75-9a64266ab5ed-inventory\") pod \"3e0b2022-2854-4995-9a75-9a64266ab5ed\" (UID: \"3e0b2022-2854-4995-9a75-9a64266ab5ed\") " Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.084749 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e0b2022-2854-4995-9a75-9a64266ab5ed-kube-api-access-2prlc" (OuterVolumeSpecName: "kube-api-access-2prlc") pod "3e0b2022-2854-4995-9a75-9a64266ab5ed" (UID: "3e0b2022-2854-4995-9a75-9a64266ab5ed"). InnerVolumeSpecName "kube-api-access-2prlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.105727 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e0b2022-2854-4995-9a75-9a64266ab5ed-inventory" (OuterVolumeSpecName: "inventory") pod "3e0b2022-2854-4995-9a75-9a64266ab5ed" (UID: "3e0b2022-2854-4995-9a75-9a64266ab5ed"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.105804 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e0b2022-2854-4995-9a75-9a64266ab5ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3e0b2022-2854-4995-9a75-9a64266ab5ed" (UID: "3e0b2022-2854-4995-9a75-9a64266ab5ed"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.181791 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2prlc\" (UniqueName: \"kubernetes.io/projected/3e0b2022-2854-4995-9a75-9a64266ab5ed-kube-api-access-2prlc\") on node \"crc\" DevicePath \"\"" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.181837 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e0b2022-2854-4995-9a75-9a64266ab5ed-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.181849 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e0b2022-2854-4995-9a75-9a64266ab5ed-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.621385 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" event={"ID":"3e0b2022-2854-4995-9a75-9a64266ab5ed","Type":"ContainerDied","Data":"9854b91034bf571cb63d72385cf59a3fe426331421282e373456ff83d1bd3c48"} Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.621465 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9854b91034bf571cb63d72385cf59a3fe426331421282e373456ff83d1bd3c48" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.621513 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.686422 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b"] Oct 06 08:50:33 crc kubenswrapper[4755]: E1006 08:50:33.686780 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e0b2022-2854-4995-9a75-9a64266ab5ed" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.686798 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e0b2022-2854-4995-9a75-9a64266ab5ed" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.686984 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e0b2022-2854-4995-9a75-9a64266ab5ed" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.687617 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.692546 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.692545 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.694879 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.696242 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.704116 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b"] Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.793575 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84daab56-b383-4b12-88cb-9c4e27a21624-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b\" (UID: \"84daab56-b383-4b12-88cb-9c4e27a21624\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.793697 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84daab56-b383-4b12-88cb-9c4e27a21624-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b\" (UID: \"84daab56-b383-4b12-88cb-9c4e27a21624\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.793741 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmcdf\" (UniqueName: \"kubernetes.io/projected/84daab56-b383-4b12-88cb-9c4e27a21624-kube-api-access-vmcdf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b\" (UID: \"84daab56-b383-4b12-88cb-9c4e27a21624\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.896090 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84daab56-b383-4b12-88cb-9c4e27a21624-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b\" (UID: \"84daab56-b383-4b12-88cb-9c4e27a21624\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.896233 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84daab56-b383-4b12-88cb-9c4e27a21624-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b\" (UID: \"84daab56-b383-4b12-88cb-9c4e27a21624\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.896297 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmcdf\" (UniqueName: \"kubernetes.io/projected/84daab56-b383-4b12-88cb-9c4e27a21624-kube-api-access-vmcdf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b\" (UID: \"84daab56-b383-4b12-88cb-9c4e27a21624\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.901008 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84daab56-b383-4b12-88cb-9c4e27a21624-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b\" (UID: 
\"84daab56-b383-4b12-88cb-9c4e27a21624\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.901639 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84daab56-b383-4b12-88cb-9c4e27a21624-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b\" (UID: \"84daab56-b383-4b12-88cb-9c4e27a21624\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" Oct 06 08:50:33 crc kubenswrapper[4755]: I1006 08:50:33.927837 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmcdf\" (UniqueName: \"kubernetes.io/projected/84daab56-b383-4b12-88cb-9c4e27a21624-kube-api-access-vmcdf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b\" (UID: \"84daab56-b383-4b12-88cb-9c4e27a21624\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" Oct 06 08:50:34 crc kubenswrapper[4755]: I1006 08:50:34.007273 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" Oct 06 08:50:34 crc kubenswrapper[4755]: I1006 08:50:34.756538 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b"] Oct 06 08:50:35 crc kubenswrapper[4755]: I1006 08:50:35.642623 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" event={"ID":"84daab56-b383-4b12-88cb-9c4e27a21624","Type":"ContainerStarted","Data":"0aad0944bd8a70026bc775d967a3e332c10d4e33736fbd8512a456f9c8852429"} Oct 06 08:50:35 crc kubenswrapper[4755]: I1006 08:50:35.643020 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" event={"ID":"84daab56-b383-4b12-88cb-9c4e27a21624","Type":"ContainerStarted","Data":"0142e1efc279401148d9227c11c7859ef06a318e09c2b53faf9bd14b38a498a3"} Oct 06 08:50:35 crc kubenswrapper[4755]: I1006 08:50:35.666019 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" podStartSLOduration=2.07690333 podStartE2EDuration="2.666002356s" podCreationTimestamp="2025-10-06 08:50:33 +0000 UTC" firstStartedPulling="2025-10-06 08:50:34.771988358 +0000 UTC m=+1691.601303572" lastFinishedPulling="2025-10-06 08:50:35.361087384 +0000 UTC m=+1692.190402598" observedRunningTime="2025-10-06 08:50:35.661943635 +0000 UTC m=+1692.491258879" watchObservedRunningTime="2025-10-06 08:50:35.666002356 +0000 UTC m=+1692.495317570" Oct 06 08:50:39 crc kubenswrapper[4755]: I1006 08:50:39.815954 4755 scope.go:117] "RemoveContainer" containerID="b325977df06f1ac602cb71fe4d337d10b6a540178b4cab1a1333239e2acb91be" Oct 06 08:50:39 crc kubenswrapper[4755]: I1006 08:50:39.838189 4755 scope.go:117] "RemoveContainer" containerID="9ad411a47f3bac48efa94042cb58e8483025f5bca058e71c4176b1e09a989674" Oct 06 08:50:39 crc 
kubenswrapper[4755]: I1006 08:50:39.883165 4755 scope.go:117] "RemoveContainer" containerID="a6a9cb6921364042b43b9ea86bf6a31a53c6e7596235570602403e72e7522ba7" Oct 06 08:50:39 crc kubenswrapper[4755]: I1006 08:50:39.922759 4755 scope.go:117] "RemoveContainer" containerID="718bf2facd3e8128dd53b5f599f0d82a100fc801e190e6c3bbf69a779e6bc625" Oct 06 08:50:39 crc kubenswrapper[4755]: I1006 08:50:39.964150 4755 scope.go:117] "RemoveContainer" containerID="ce014ca69cc1ed77df7a3dfb5687b4719459a042d347dd31a02436c239aa6d4d" Oct 06 08:50:40 crc kubenswrapper[4755]: I1006 08:50:40.005966 4755 scope.go:117] "RemoveContainer" containerID="602ebd4a59a29fd43ba48888cb66ba4e3ab2e58b06ce762f1bac8fd5ad26b4a2" Oct 06 08:50:40 crc kubenswrapper[4755]: I1006 08:50:40.047859 4755 scope.go:117] "RemoveContainer" containerID="8b136dda21744b14dcde7049379ef128262393a590b1c50a30730d9117bf1c0d" Oct 06 08:50:44 crc kubenswrapper[4755]: I1006 08:50:44.725332 4755 generic.go:334] "Generic (PLEG): container finished" podID="84daab56-b383-4b12-88cb-9c4e27a21624" containerID="0aad0944bd8a70026bc775d967a3e332c10d4e33736fbd8512a456f9c8852429" exitCode=0 Oct 06 08:50:44 crc kubenswrapper[4755]: I1006 08:50:44.725432 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" event={"ID":"84daab56-b383-4b12-88cb-9c4e27a21624","Type":"ContainerDied","Data":"0aad0944bd8a70026bc775d967a3e332c10d4e33736fbd8512a456f9c8852429"} Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.163369 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.244713 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84daab56-b383-4b12-88cb-9c4e27a21624-ssh-key\") pod \"84daab56-b383-4b12-88cb-9c4e27a21624\" (UID: \"84daab56-b383-4b12-88cb-9c4e27a21624\") " Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.244758 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84daab56-b383-4b12-88cb-9c4e27a21624-inventory\") pod \"84daab56-b383-4b12-88cb-9c4e27a21624\" (UID: \"84daab56-b383-4b12-88cb-9c4e27a21624\") " Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.244780 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmcdf\" (UniqueName: \"kubernetes.io/projected/84daab56-b383-4b12-88cb-9c4e27a21624-kube-api-access-vmcdf\") pod \"84daab56-b383-4b12-88cb-9c4e27a21624\" (UID: \"84daab56-b383-4b12-88cb-9c4e27a21624\") " Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.250714 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84daab56-b383-4b12-88cb-9c4e27a21624-kube-api-access-vmcdf" (OuterVolumeSpecName: "kube-api-access-vmcdf") pod "84daab56-b383-4b12-88cb-9c4e27a21624" (UID: "84daab56-b383-4b12-88cb-9c4e27a21624"). InnerVolumeSpecName "kube-api-access-vmcdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.269948 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84daab56-b383-4b12-88cb-9c4e27a21624-inventory" (OuterVolumeSpecName: "inventory") pod "84daab56-b383-4b12-88cb-9c4e27a21624" (UID: "84daab56-b383-4b12-88cb-9c4e27a21624"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.270582 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84daab56-b383-4b12-88cb-9c4e27a21624-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "84daab56-b383-4b12-88cb-9c4e27a21624" (UID: "84daab56-b383-4b12-88cb-9c4e27a21624"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.347047 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84daab56-b383-4b12-88cb-9c4e27a21624-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.347089 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84daab56-b383-4b12-88cb-9c4e27a21624-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.347102 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmcdf\" (UniqueName: \"kubernetes.io/projected/84daab56-b383-4b12-88cb-9c4e27a21624-kube-api-access-vmcdf\") on node \"crc\" DevicePath \"\"" Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.753289 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" event={"ID":"84daab56-b383-4b12-88cb-9c4e27a21624","Type":"ContainerDied","Data":"0142e1efc279401148d9227c11c7859ef06a318e09c2b53faf9bd14b38a498a3"} Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.753343 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0142e1efc279401148d9227c11c7859ef06a318e09c2b53faf9bd14b38a498a3" Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.753407 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b" Oct 06 08:50:46 crc kubenswrapper[4755]: I1006 08:50:46.879105 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:50:46 crc kubenswrapper[4755]: E1006 08:50:46.879584 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:50:53 crc kubenswrapper[4755]: I1006 08:50:53.044311 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-wfkm6"] Oct 06 08:50:53 crc kubenswrapper[4755]: I1006 08:50:53.051600 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-wfkm6"] Oct 06 08:50:53 crc kubenswrapper[4755]: I1006 08:50:53.892193 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce67a97c-6bfd-4684-be25-c82eec5f8237" path="/var/lib/kubelet/pods/ce67a97c-6bfd-4684-be25-c82eec5f8237/volumes" Oct 06 08:50:55 crc kubenswrapper[4755]: I1006 08:50:55.034547 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8f45x"] Oct 06 08:50:55 crc kubenswrapper[4755]: I1006 08:50:55.041629 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8f45x"] Oct 06 08:50:55 crc kubenswrapper[4755]: I1006 08:50:55.891006 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca576ccd-2a13-4b2c-ab8e-df22112b4711" path="/var/lib/kubelet/pods/ca576ccd-2a13-4b2c-ab8e-df22112b4711/volumes" Oct 06 08:51:00 crc kubenswrapper[4755]: I1006 08:51:00.878853 
4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:51:00 crc kubenswrapper[4755]: E1006 08:51:00.879674 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:51:15 crc kubenswrapper[4755]: I1006 08:51:15.880120 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:51:15 crc kubenswrapper[4755]: E1006 08:51:15.881767 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:51:26 crc kubenswrapper[4755]: I1006 08:51:26.878328 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:51:26 crc kubenswrapper[4755]: E1006 08:51:26.879015 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:51:37 crc kubenswrapper[4755]: I1006 
08:51:37.052656 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dwgtj"] Oct 06 08:51:37 crc kubenswrapper[4755]: I1006 08:51:37.066235 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-dwgtj"] Oct 06 08:51:37 crc kubenswrapper[4755]: I1006 08:51:37.891687 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ffe843e-b0c1-40ab-a1ad-d412b46c03e3" path="/var/lib/kubelet/pods/9ffe843e-b0c1-40ab-a1ad-d412b46c03e3/volumes" Oct 06 08:51:40 crc kubenswrapper[4755]: I1006 08:51:40.191368 4755 scope.go:117] "RemoveContainer" containerID="2b8ba3be8bf9e372d59574f01a2aafd9839e1f4bf9178e3242945ac088719f55" Oct 06 08:51:40 crc kubenswrapper[4755]: I1006 08:51:40.252585 4755 scope.go:117] "RemoveContainer" containerID="b211b2176638185955f3684109429af42717d7e2754bc90bbe094b52115dd34f" Oct 06 08:51:40 crc kubenswrapper[4755]: I1006 08:51:40.330165 4755 scope.go:117] "RemoveContainer" containerID="8aac7e5406185c0e45a14415ceb0bf3f16b3de7488b4f5a15ee14ac46996736f" Oct 06 08:51:41 crc kubenswrapper[4755]: I1006 08:51:41.879204 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:51:41 crc kubenswrapper[4755]: E1006 08:51:41.879797 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:51:52 crc kubenswrapper[4755]: I1006 08:51:52.878282 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:51:52 crc kubenswrapper[4755]: E1006 08:51:52.878914 4755 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:52:07 crc kubenswrapper[4755]: I1006 08:52:07.879339 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:52:07 crc kubenswrapper[4755]: E1006 08:52:07.880104 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:52:19 crc kubenswrapper[4755]: I1006 08:52:19.879990 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:52:19 crc kubenswrapper[4755]: E1006 08:52:19.881159 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:52:33 crc kubenswrapper[4755]: I1006 08:52:33.889110 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:52:33 crc kubenswrapper[4755]: E1006 
08:52:33.890902 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:52:47 crc kubenswrapper[4755]: I1006 08:52:47.878782 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:52:47 crc kubenswrapper[4755]: E1006 08:52:47.879533 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:53:02 crc kubenswrapper[4755]: I1006 08:53:02.879291 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:53:02 crc kubenswrapper[4755]: E1006 08:53:02.880059 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:53:16 crc kubenswrapper[4755]: I1006 08:53:16.878653 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:53:16 crc 
kubenswrapper[4755]: E1006 08:53:16.879370 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:53:30 crc kubenswrapper[4755]: I1006 08:53:30.879105 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:53:30 crc kubenswrapper[4755]: E1006 08:53:30.879865 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:53:44 crc kubenswrapper[4755]: I1006 08:53:44.878511 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:53:44 crc kubenswrapper[4755]: E1006 08:53:44.879194 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:53:59 crc kubenswrapper[4755]: I1006 08:53:59.878717 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 
06 08:53:59 crc kubenswrapper[4755]: E1006 08:53:59.879693 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:54:14 crc kubenswrapper[4755]: I1006 08:54:14.878626 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:54:14 crc kubenswrapper[4755]: E1006 08:54:14.879374 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:54:27 crc kubenswrapper[4755]: I1006 08:54:27.880453 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:54:27 crc kubenswrapper[4755]: E1006 08:54:27.881767 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:54:41 crc kubenswrapper[4755]: I1006 08:54:41.879644 4755 scope.go:117] "RemoveContainer" 
containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:54:41 crc kubenswrapper[4755]: E1006 08:54:41.880420 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:54:52 crc kubenswrapper[4755]: I1006 08:54:52.878761 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:54:52 crc kubenswrapper[4755]: E1006 08:54:52.879660 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:55:06 crc kubenswrapper[4755]: I1006 08:55:06.879023 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:55:06 crc kubenswrapper[4755]: E1006 08:55:06.880646 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.264479 4755 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.276796 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.285173 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.292138 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.299265 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bchc9"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.305658 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.311076 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bchc9"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.316674 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xk4fp"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.323039 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7pdpf"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.329498 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6cjc7"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.335074 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz"] Oct 06 
08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.340489 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bmqvj"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.348614 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-lwpcz"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.362828 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.370463 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5mlzg"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.376363 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.384586 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.393039 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.399547 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5hl8b"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.406488 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fcrl"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.412373 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dd26h"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.420699 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7fvjk"] Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.891322 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a0700ef-3f60-458c-a388-927d24dfaed2" path="/var/lib/kubelet/pods/0a0700ef-3f60-458c-a388-927d24dfaed2/volumes" Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.892040 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2158b130-0ef0-452f-bb10-2b6738c19e21" path="/var/lib/kubelet/pods/2158b130-0ef0-452f-bb10-2b6738c19e21/volumes" Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.892661 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30def537-c2c5-4042-bed2-29c3a6f6bc57" path="/var/lib/kubelet/pods/30def537-c2c5-4042-bed2-29c3a6f6bc57/volumes" Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.893287 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e0b2022-2854-4995-9a75-9a64266ab5ed" path="/var/lib/kubelet/pods/3e0b2022-2854-4995-9a75-9a64266ab5ed/volumes" Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.894632 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="436cff83-cfd5-431a-bd35-519543c0a74c" path="/var/lib/kubelet/pods/436cff83-cfd5-431a-bd35-519543c0a74c/volumes" Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.895251 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4931f32e-25ea-4ccc-8b80-83ae4422932c" path="/var/lib/kubelet/pods/4931f32e-25ea-4ccc-8b80-83ae4422932c/volumes" Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.895893 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="642222bb-af72-44e4-a6ee-bb8f97e23c93" path="/var/lib/kubelet/pods/642222bb-af72-44e4-a6ee-bb8f97e23c93/volumes" Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.897318 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="84daab56-b383-4b12-88cb-9c4e27a21624" path="/var/lib/kubelet/pods/84daab56-b383-4b12-88cb-9c4e27a21624/volumes" Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.897937 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88915740-2d1e-4127-9c29-497f8d485408" path="/var/lib/kubelet/pods/88915740-2d1e-4127-9c29-497f8d485408/volumes" Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.898762 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afe346a8-540e-4e57-8f18-a5d0f2b34232" path="/var/lib/kubelet/pods/afe346a8-540e-4e57-8f18-a5d0f2b34232/volumes" Oct 06 08:55:09 crc kubenswrapper[4755]: I1006 08:55:09.899850 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6344d90-5879-472d-8bbd-bd6f7c6c8d7a" path="/var/lib/kubelet/pods/c6344d90-5879-472d-8bbd-bd6f7c6c8d7a/volumes" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.122994 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk"] Oct 06 08:55:15 crc kubenswrapper[4755]: E1006 08:55:15.123937 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84daab56-b383-4b12-88cb-9c4e27a21624" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.123953 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="84daab56-b383-4b12-88cb-9c4e27a21624" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.124222 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="84daab56-b383-4b12-88cb-9c4e27a21624" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.124963 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.126861 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.127110 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.130272 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.130784 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.131841 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.138083 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk"] Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.280058 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.280159 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mmdz\" (UniqueName: \"kubernetes.io/projected/7735ba28-55e1-42e4-8fee-463bb64a240a-kube-api-access-8mmdz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.280223 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.280252 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.280277 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.382123 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mmdz\" (UniqueName: \"kubernetes.io/projected/7735ba28-55e1-42e4-8fee-463bb64a240a-kube-api-access-8mmdz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.382203 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.382228 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.382246 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.382302 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.390381 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.390462 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.391116 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.393066 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.403232 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mmdz\" (UniqueName: \"kubernetes.io/projected/7735ba28-55e1-42e4-8fee-463bb64a240a-kube-api-access-8mmdz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.445382 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.957844 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk"] Oct 06 08:55:15 crc kubenswrapper[4755]: I1006 08:55:15.967120 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 08:55:16 crc kubenswrapper[4755]: I1006 08:55:16.327801 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" event={"ID":"7735ba28-55e1-42e4-8fee-463bb64a240a","Type":"ContainerStarted","Data":"43fdc39488f946a0c4cf4d0d0cfec45ba81ed6e2b3d82267f92960e4aa94bb58"} Oct 06 08:55:17 crc kubenswrapper[4755]: I1006 08:55:17.339918 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" event={"ID":"7735ba28-55e1-42e4-8fee-463bb64a240a","Type":"ContainerStarted","Data":"0459d1beffd0f4d662efe8d0872ecbc6cae26e496562c1cb387a47f1223e897a"} Oct 06 08:55:17 crc kubenswrapper[4755]: I1006 08:55:17.360444 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" podStartSLOduration=1.74251519 podStartE2EDuration="2.360421487s" podCreationTimestamp="2025-10-06 08:55:15 +0000 UTC" firstStartedPulling="2025-10-06 08:55:15.966831931 +0000 UTC m=+1972.796147145" lastFinishedPulling="2025-10-06 08:55:16.584738228 +0000 UTC m=+1973.414053442" observedRunningTime="2025-10-06 08:55:17.358746265 +0000 UTC m=+1974.188061489" watchObservedRunningTime="2025-10-06 08:55:17.360421487 +0000 UTC m=+1974.189736701" Oct 06 08:55:18 crc kubenswrapper[4755]: I1006 08:55:18.880306 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:55:18 crc kubenswrapper[4755]: E1006 
08:55:18.881048 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 08:55:27 crc kubenswrapper[4755]: I1006 08:55:27.422730 4755 generic.go:334] "Generic (PLEG): container finished" podID="7735ba28-55e1-42e4-8fee-463bb64a240a" containerID="0459d1beffd0f4d662efe8d0872ecbc6cae26e496562c1cb387a47f1223e897a" exitCode=0 Oct 06 08:55:27 crc kubenswrapper[4755]: I1006 08:55:27.422820 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" event={"ID":"7735ba28-55e1-42e4-8fee-463bb64a240a","Type":"ContainerDied","Data":"0459d1beffd0f4d662efe8d0872ecbc6cae26e496562c1cb387a47f1223e897a"} Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.812628 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.847151 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-ceph\") pod \"7735ba28-55e1-42e4-8fee-463bb64a240a\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.847224 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mmdz\" (UniqueName: \"kubernetes.io/projected/7735ba28-55e1-42e4-8fee-463bb64a240a-kube-api-access-8mmdz\") pod \"7735ba28-55e1-42e4-8fee-463bb64a240a\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.847403 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-inventory\") pod \"7735ba28-55e1-42e4-8fee-463bb64a240a\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.847440 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-ssh-key\") pod \"7735ba28-55e1-42e4-8fee-463bb64a240a\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.847505 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-repo-setup-combined-ca-bundle\") pod \"7735ba28-55e1-42e4-8fee-463bb64a240a\" (UID: \"7735ba28-55e1-42e4-8fee-463bb64a240a\") " Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.853116 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7735ba28-55e1-42e4-8fee-463bb64a240a" (UID: "7735ba28-55e1-42e4-8fee-463bb64a240a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.855787 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-ceph" (OuterVolumeSpecName: "ceph") pod "7735ba28-55e1-42e4-8fee-463bb64a240a" (UID: "7735ba28-55e1-42e4-8fee-463bb64a240a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.857334 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7735ba28-55e1-42e4-8fee-463bb64a240a-kube-api-access-8mmdz" (OuterVolumeSpecName: "kube-api-access-8mmdz") pod "7735ba28-55e1-42e4-8fee-463bb64a240a" (UID: "7735ba28-55e1-42e4-8fee-463bb64a240a"). InnerVolumeSpecName "kube-api-access-8mmdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.879811 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-inventory" (OuterVolumeSpecName: "inventory") pod "7735ba28-55e1-42e4-8fee-463bb64a240a" (UID: "7735ba28-55e1-42e4-8fee-463bb64a240a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.880447 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7735ba28-55e1-42e4-8fee-463bb64a240a" (UID: "7735ba28-55e1-42e4-8fee-463bb64a240a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.950901 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.950939 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mmdz\" (UniqueName: \"kubernetes.io/projected/7735ba28-55e1-42e4-8fee-463bb64a240a-kube-api-access-8mmdz\") on node \"crc\" DevicePath \"\"" Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.950952 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.950964 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:55:28 crc kubenswrapper[4755]: I1006 08:55:28.950977 4755 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7735ba28-55e1-42e4-8fee-463bb64a240a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.440119 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" event={"ID":"7735ba28-55e1-42e4-8fee-463bb64a240a","Type":"ContainerDied","Data":"43fdc39488f946a0c4cf4d0d0cfec45ba81ed6e2b3d82267f92960e4aa94bb58"} Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.440427 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43fdc39488f946a0c4cf4d0d0cfec45ba81ed6e2b3d82267f92960e4aa94bb58" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.440488 
4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.508680 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k"] Oct 06 08:55:29 crc kubenswrapper[4755]: E1006 08:55:29.509036 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7735ba28-55e1-42e4-8fee-463bb64a240a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.509052 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7735ba28-55e1-42e4-8fee-463bb64a240a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.509235 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7735ba28-55e1-42e4-8fee-463bb64a240a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.509984 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.511808 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.512216 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.512350 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.512741 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.512855 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.519875 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k"] Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.561715 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.561814 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.561859 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n2wl\" (UniqueName: \"kubernetes.io/projected/7e47d328-ee62-4e39-9912-8b08b04189c0-kube-api-access-2n2wl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.561889 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.561957 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.663796 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.663954 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.664021 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n2wl\" (UniqueName: \"kubernetes.io/projected/7e47d328-ee62-4e39-9912-8b08b04189c0-kube-api-access-2n2wl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.664067 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.664148 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.668693 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.670135 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.670630 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.671273 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.682390 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n2wl\" (UniqueName: \"kubernetes.io/projected/7e47d328-ee62-4e39-9912-8b08b04189c0-kube-api-access-2n2wl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:29 crc kubenswrapper[4755]: I1006 08:55:29.828268 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:55:30 crc kubenswrapper[4755]: W1006 08:55:30.400013 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e47d328_ee62_4e39_9912_8b08b04189c0.slice/crio-51af7f20f81aaf48fe6e5cc275e73ab6bf0aa67a6d90b296d66c3036a183a1d1 WatchSource:0}: Error finding container 51af7f20f81aaf48fe6e5cc275e73ab6bf0aa67a6d90b296d66c3036a183a1d1: Status 404 returned error can't find the container with id 51af7f20f81aaf48fe6e5cc275e73ab6bf0aa67a6d90b296d66c3036a183a1d1 Oct 06 08:55:30 crc kubenswrapper[4755]: I1006 08:55:30.400606 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k"] Oct 06 08:55:30 crc kubenswrapper[4755]: I1006 08:55:30.449492 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" event={"ID":"7e47d328-ee62-4e39-9912-8b08b04189c0","Type":"ContainerStarted","Data":"51af7f20f81aaf48fe6e5cc275e73ab6bf0aa67a6d90b296d66c3036a183a1d1"} Oct 06 08:55:32 crc kubenswrapper[4755]: I1006 08:55:32.465686 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" event={"ID":"7e47d328-ee62-4e39-9912-8b08b04189c0","Type":"ContainerStarted","Data":"1367266972cf6e158daba983a8927ead38e698225dc3c4239d56ccb48ce906d3"} Oct 06 08:55:32 crc kubenswrapper[4755]: I1006 08:55:32.490231 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" podStartSLOduration=2.575975897 podStartE2EDuration="3.490207777s" podCreationTimestamp="2025-10-06 08:55:29 +0000 UTC" firstStartedPulling="2025-10-06 08:55:30.40235694 +0000 UTC m=+1987.231672154" lastFinishedPulling="2025-10-06 08:55:31.31658882 +0000 UTC m=+1988.145904034" 
observedRunningTime="2025-10-06 08:55:32.48594016 +0000 UTC m=+1989.315255374" watchObservedRunningTime="2025-10-06 08:55:32.490207777 +0000 UTC m=+1989.319522991" Oct 06 08:55:33 crc kubenswrapper[4755]: I1006 08:55:33.887361 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:55:34 crc kubenswrapper[4755]: I1006 08:55:34.486376 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"07b1bac86ef25134b8ebed154053528dffbc3145250e0269cad9a7970e57b7da"} Oct 06 08:55:40 crc kubenswrapper[4755]: I1006 08:55:40.478583 4755 scope.go:117] "RemoveContainer" containerID="56eca02a3c85d115cabbb953459caa90625a3a701be72b0604bd4e40cafb9eda" Oct 06 08:55:40 crc kubenswrapper[4755]: I1006 08:55:40.533830 4755 scope.go:117] "RemoveContainer" containerID="82a81037156df7644363a0830371137a19740acc7e5782bb42d5027d0bb7ed5a" Oct 06 08:55:40 crc kubenswrapper[4755]: I1006 08:55:40.580944 4755 scope.go:117] "RemoveContainer" containerID="1944fccdb874a9c58e19277840200668ce4712ef33ffe186e3e389d2b45446a0" Oct 06 08:55:40 crc kubenswrapper[4755]: I1006 08:55:40.612335 4755 scope.go:117] "RemoveContainer" containerID="69af5b1fc56f0bdcf35fe08080e54e68f61b67148f803a7889f22b09e620a5f8" Oct 06 08:55:40 crc kubenswrapper[4755]: I1006 08:55:40.703446 4755 scope.go:117] "RemoveContainer" containerID="f520a9d9452bd920b5143f99eb59204b102921ed39d7806c079293ef55561a18" Oct 06 08:55:40 crc kubenswrapper[4755]: I1006 08:55:40.752496 4755 scope.go:117] "RemoveContainer" containerID="90f9a6564fc9de9e5c5f2394cd1d105cb2e18666540b5631cfe1890c5dd09252" Oct 06 08:55:40 crc kubenswrapper[4755]: I1006 08:55:40.787410 4755 scope.go:117] "RemoveContainer" containerID="2cd7dd5c1b5ad7d6285459fc9db698c39cf6bef6bfa97b8bd1a177e30f2bf465" Oct 06 08:55:40 crc kubenswrapper[4755]: I1006 08:55:40.830967 
4755 scope.go:117] "RemoveContainer" containerID="33b3e6ea9811e009e13e2b5e5d46fdd5fb4dbbadedd470844635b20d3bacc843" Oct 06 08:55:40 crc kubenswrapper[4755]: I1006 08:55:40.847884 4755 scope.go:117] "RemoveContainer" containerID="114f9fa3a9c7636fb5242c2d41989b6580f758570bb9bb9e1a7c030887d08453" Oct 06 08:55:40 crc kubenswrapper[4755]: I1006 08:55:40.873116 4755 scope.go:117] "RemoveContainer" containerID="4a990036cf7a2399eec5476c5ac6aa44082bd166d4b15e31e40359e0932d6375" Oct 06 08:55:40 crc kubenswrapper[4755]: I1006 08:55:40.890949 4755 scope.go:117] "RemoveContainer" containerID="b5f970dde320f1fb2f3b25098fd82abd0290b06722856c40444150c1aa7eff82" Oct 06 08:56:41 crc kubenswrapper[4755]: I1006 08:56:41.032826 4755 scope.go:117] "RemoveContainer" containerID="829f151605817342884f44c5890146f5e6640b9a2b097f97b8838073ff75abbd" Oct 06 08:56:41 crc kubenswrapper[4755]: I1006 08:56:41.075468 4755 scope.go:117] "RemoveContainer" containerID="31b9dcca4ab87e25d626f44e7752eee4605690462bd5f0240edccdcf205e36e2" Oct 06 08:56:41 crc kubenswrapper[4755]: I1006 08:56:41.142245 4755 scope.go:117] "RemoveContainer" containerID="0aad0944bd8a70026bc775d967a3e332c10d4e33736fbd8512a456f9c8852429" Oct 06 08:57:02 crc kubenswrapper[4755]: I1006 08:57:02.229497 4755 generic.go:334] "Generic (PLEG): container finished" podID="7e47d328-ee62-4e39-9912-8b08b04189c0" containerID="1367266972cf6e158daba983a8927ead38e698225dc3c4239d56ccb48ce906d3" exitCode=0 Oct 06 08:57:02 crc kubenswrapper[4755]: I1006 08:57:02.229619 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" event={"ID":"7e47d328-ee62-4e39-9912-8b08b04189c0","Type":"ContainerDied","Data":"1367266972cf6e158daba983a8927ead38e698225dc3c4239d56ccb48ce906d3"} Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.652867 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.744776 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-ssh-key\") pod \"7e47d328-ee62-4e39-9912-8b08b04189c0\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.744909 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-inventory\") pod \"7e47d328-ee62-4e39-9912-8b08b04189c0\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.744966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-bootstrap-combined-ca-bundle\") pod \"7e47d328-ee62-4e39-9912-8b08b04189c0\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.745017 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-ceph\") pod \"7e47d328-ee62-4e39-9912-8b08b04189c0\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.745747 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n2wl\" (UniqueName: \"kubernetes.io/projected/7e47d328-ee62-4e39-9912-8b08b04189c0-kube-api-access-2n2wl\") pod \"7e47d328-ee62-4e39-9912-8b08b04189c0\" (UID: \"7e47d328-ee62-4e39-9912-8b08b04189c0\") " Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.750685 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7e47d328-ee62-4e39-9912-8b08b04189c0-kube-api-access-2n2wl" (OuterVolumeSpecName: "kube-api-access-2n2wl") pod "7e47d328-ee62-4e39-9912-8b08b04189c0" (UID: "7e47d328-ee62-4e39-9912-8b08b04189c0"). InnerVolumeSpecName "kube-api-access-2n2wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.756818 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7e47d328-ee62-4e39-9912-8b08b04189c0" (UID: "7e47d328-ee62-4e39-9912-8b08b04189c0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.757472 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-ceph" (OuterVolumeSpecName: "ceph") pod "7e47d328-ee62-4e39-9912-8b08b04189c0" (UID: "7e47d328-ee62-4e39-9912-8b08b04189c0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.772468 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7e47d328-ee62-4e39-9912-8b08b04189c0" (UID: "7e47d328-ee62-4e39-9912-8b08b04189c0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.772782 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-inventory" (OuterVolumeSpecName: "inventory") pod "7e47d328-ee62-4e39-9912-8b08b04189c0" (UID: "7e47d328-ee62-4e39-9912-8b08b04189c0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.847543 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n2wl\" (UniqueName: \"kubernetes.io/projected/7e47d328-ee62-4e39-9912-8b08b04189c0-kube-api-access-2n2wl\") on node \"crc\" DevicePath \"\"" Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.847591 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.847601 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.847610 4755 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:57:03 crc kubenswrapper[4755]: I1006 08:57:03.847618 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e47d328-ee62-4e39-9912-8b08b04189c0-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.250696 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" event={"ID":"7e47d328-ee62-4e39-9912-8b08b04189c0","Type":"ContainerDied","Data":"51af7f20f81aaf48fe6e5cc275e73ab6bf0aa67a6d90b296d66c3036a183a1d1"} Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.250735 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51af7f20f81aaf48fe6e5cc275e73ab6bf0aa67a6d90b296d66c3036a183a1d1" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.250752 4755 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.335690 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz"] Oct 06 08:57:04 crc kubenswrapper[4755]: E1006 08:57:04.336122 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e47d328-ee62-4e39-9912-8b08b04189c0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.336148 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e47d328-ee62-4e39-9912-8b08b04189c0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.336377 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e47d328-ee62-4e39-9912-8b08b04189c0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.337159 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.339158 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.339951 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.340150 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.340265 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.343022 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.354737 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz"] Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.458694 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4x47\" (UniqueName: \"kubernetes.io/projected/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-kube-api-access-t4x47\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.458736 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz\" (UID: 
\"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.458821 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.458853 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.559918 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.560038 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4x47\" (UniqueName: \"kubernetes.io/projected/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-kube-api-access-t4x47\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.560063 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.560123 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.565056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.565068 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.573645 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.576712 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4x47\" (UniqueName: \"kubernetes.io/projected/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-kube-api-access-t4x47\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:04 crc kubenswrapper[4755]: I1006 08:57:04.653966 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:05 crc kubenswrapper[4755]: I1006 08:57:05.209248 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz"] Oct 06 08:57:05 crc kubenswrapper[4755]: I1006 08:57:05.259012 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" event={"ID":"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b","Type":"ContainerStarted","Data":"6163ad8339bed19d624d0e887e0795691885269cd6cbc4641bfeafe622311e7f"} Oct 06 08:57:06 crc kubenswrapper[4755]: I1006 08:57:06.268530 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" event={"ID":"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b","Type":"ContainerStarted","Data":"efd7dbab5fb6404d0df5e4ef4812f084c09d21df98c382e61b2c997012a6583c"} Oct 06 08:57:06 crc kubenswrapper[4755]: I1006 08:57:06.286676 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" podStartSLOduration=1.746869014 podStartE2EDuration="2.286639982s" podCreationTimestamp="2025-10-06 08:57:04 +0000 UTC" firstStartedPulling="2025-10-06 
08:57:05.195163556 +0000 UTC m=+2082.024478770" lastFinishedPulling="2025-10-06 08:57:05.734934524 +0000 UTC m=+2082.564249738" observedRunningTime="2025-10-06 08:57:06.283637239 +0000 UTC m=+2083.112952473" watchObservedRunningTime="2025-10-06 08:57:06.286639982 +0000 UTC m=+2083.115955196" Oct 06 08:57:28 crc kubenswrapper[4755]: I1006 08:57:28.444045 4755 generic.go:334] "Generic (PLEG): container finished" podID="be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b" containerID="efd7dbab5fb6404d0df5e4ef4812f084c09d21df98c382e61b2c997012a6583c" exitCode=0 Oct 06 08:57:28 crc kubenswrapper[4755]: I1006 08:57:28.444104 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" event={"ID":"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b","Type":"ContainerDied","Data":"efd7dbab5fb6404d0df5e4ef4812f084c09d21df98c382e61b2c997012a6583c"} Oct 06 08:57:29 crc kubenswrapper[4755]: I1006 08:57:29.858803 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:29 crc kubenswrapper[4755]: I1006 08:57:29.979538 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ceph\") pod \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " Oct 06 08:57:29 crc kubenswrapper[4755]: I1006 08:57:29.979666 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-inventory\") pod \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " Oct 06 08:57:29 crc kubenswrapper[4755]: I1006 08:57:29.979705 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ssh-key\") pod \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " Oct 06 08:57:29 crc kubenswrapper[4755]: I1006 08:57:29.979765 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4x47\" (UniqueName: \"kubernetes.io/projected/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-kube-api-access-t4x47\") pod \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " Oct 06 08:57:29 crc kubenswrapper[4755]: I1006 08:57:29.984482 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-kube-api-access-t4x47" (OuterVolumeSpecName: "kube-api-access-t4x47") pod "be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b" (UID: "be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b"). InnerVolumeSpecName "kube-api-access-t4x47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:57:29 crc kubenswrapper[4755]: I1006 08:57:29.985607 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ceph" (OuterVolumeSpecName: "ceph") pod "be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b" (UID: "be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:57:30 crc kubenswrapper[4755]: E1006 08:57:30.001041 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ssh-key podName:be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b nodeName:}" failed. No retries permitted until 2025-10-06 08:57:30.501012012 +0000 UTC m=+2107.330327236 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ssh-key") pod "be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b" (UID: "be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b") : error deleting /var/lib/kubelet/pods/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b/volume-subpaths: remove /var/lib/kubelet/pods/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b/volume-subpaths: no such file or directory Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.004049 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-inventory" (OuterVolumeSpecName: "inventory") pod "be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b" (UID: "be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.081804 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.082121 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.082133 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4x47\" (UniqueName: \"kubernetes.io/projected/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-kube-api-access-t4x47\") on node \"crc\" DevicePath \"\"" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.463382 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" event={"ID":"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b","Type":"ContainerDied","Data":"6163ad8339bed19d624d0e887e0795691885269cd6cbc4641bfeafe622311e7f"} Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.463435 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6163ad8339bed19d624d0e887e0795691885269cd6cbc4641bfeafe622311e7f" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.463435 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.541029 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv"] Oct 06 08:57:30 crc kubenswrapper[4755]: E1006 08:57:30.541461 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.541488 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.541730 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.542428 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.559302 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv"] Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.591647 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ssh-key\") pod \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\" (UID: \"be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b\") " Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.592095 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krl4p\" (UniqueName: \"kubernetes.io/projected/1246bd47-a86b-4708-8124-a77064659911-kube-api-access-krl4p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.592191 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.592256 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.592312 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.595210 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b" (UID: "be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.693627 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.693733 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.693781 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.693808 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krl4p\" (UniqueName: \"kubernetes.io/projected/1246bd47-a86b-4708-8124-a77064659911-kube-api-access-krl4p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.693874 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.697072 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.697099 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.697154 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.708278 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krl4p\" (UniqueName: \"kubernetes.io/projected/1246bd47-a86b-4708-8124-a77064659911-kube-api-access-krl4p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:30 crc kubenswrapper[4755]: I1006 08:57:30.860276 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:31 crc kubenswrapper[4755]: I1006 08:57:31.359960 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv"] Oct 06 08:57:31 crc kubenswrapper[4755]: I1006 08:57:31.473027 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" event={"ID":"1246bd47-a86b-4708-8124-a77064659911","Type":"ContainerStarted","Data":"51d43824d5185a8aaca6ab2166a819d930c1b9f75b496840e01d7d703e2416d7"} Oct 06 08:57:33 crc kubenswrapper[4755]: I1006 08:57:33.490113 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" event={"ID":"1246bd47-a86b-4708-8124-a77064659911","Type":"ContainerStarted","Data":"9d7136e690f01bb1ba46c21454b34cea2c5cd3a83b2395b487c134bec09722bf"} Oct 06 08:57:33 crc kubenswrapper[4755]: I1006 08:57:33.519527 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" podStartSLOduration=2.507696683 podStartE2EDuration="3.51950719s" podCreationTimestamp="2025-10-06 08:57:30 +0000 UTC" firstStartedPulling="2025-10-06 08:57:31.368725491 +0000 UTC m=+2108.198040705" lastFinishedPulling="2025-10-06 08:57:32.380535998 +0000 UTC m=+2109.209851212" observedRunningTime="2025-10-06 08:57:33.510688115 +0000 UTC m=+2110.340003349" watchObservedRunningTime="2025-10-06 08:57:33.51950719 +0000 UTC m=+2110.348822404" Oct 06 08:57:37 crc kubenswrapper[4755]: I1006 08:57:37.520607 4755 generic.go:334] "Generic (PLEG): container finished" podID="1246bd47-a86b-4708-8124-a77064659911" containerID="9d7136e690f01bb1ba46c21454b34cea2c5cd3a83b2395b487c134bec09722bf" exitCode=0 Oct 06 08:57:37 crc kubenswrapper[4755]: I1006 08:57:37.520709 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" event={"ID":"1246bd47-a86b-4708-8124-a77064659911","Type":"ContainerDied","Data":"9d7136e690f01bb1ba46c21454b34cea2c5cd3a83b2395b487c134bec09722bf"} Oct 06 08:57:38 crc kubenswrapper[4755]: I1006 08:57:38.914113 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.046818 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-inventory\") pod \"1246bd47-a86b-4708-8124-a77064659911\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.046993 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-ceph\") pod \"1246bd47-a86b-4708-8124-a77064659911\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.047022 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krl4p\" (UniqueName: \"kubernetes.io/projected/1246bd47-a86b-4708-8124-a77064659911-kube-api-access-krl4p\") pod \"1246bd47-a86b-4708-8124-a77064659911\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.047124 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-ssh-key\") pod \"1246bd47-a86b-4708-8124-a77064659911\" (UID: \"1246bd47-a86b-4708-8124-a77064659911\") " Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.052466 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-ceph" (OuterVolumeSpecName: "ceph") pod "1246bd47-a86b-4708-8124-a77064659911" (UID: "1246bd47-a86b-4708-8124-a77064659911"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.052627 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1246bd47-a86b-4708-8124-a77064659911-kube-api-access-krl4p" (OuterVolumeSpecName: "kube-api-access-krl4p") pod "1246bd47-a86b-4708-8124-a77064659911" (UID: "1246bd47-a86b-4708-8124-a77064659911"). InnerVolumeSpecName "kube-api-access-krl4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.073195 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-inventory" (OuterVolumeSpecName: "inventory") pod "1246bd47-a86b-4708-8124-a77064659911" (UID: "1246bd47-a86b-4708-8124-a77064659911"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.076169 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1246bd47-a86b-4708-8124-a77064659911" (UID: "1246bd47-a86b-4708-8124-a77064659911"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.149317 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.149363 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krl4p\" (UniqueName: \"kubernetes.io/projected/1246bd47-a86b-4708-8124-a77064659911-kube-api-access-krl4p\") on node \"crc\" DevicePath \"\"" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.149374 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.149382 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1246bd47-a86b-4708-8124-a77064659911-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.537387 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" event={"ID":"1246bd47-a86b-4708-8124-a77064659911","Type":"ContainerDied","Data":"51d43824d5185a8aaca6ab2166a819d930c1b9f75b496840e01d7d703e2416d7"} Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.537703 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51d43824d5185a8aaca6ab2166a819d930c1b9f75b496840e01d7d703e2416d7" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.537436 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.626779 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld"] Oct 06 08:57:39 crc kubenswrapper[4755]: E1006 08:57:39.627196 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1246bd47-a86b-4708-8124-a77064659911" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.627220 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1246bd47-a86b-4708-8124-a77064659911" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.627464 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1246bd47-a86b-4708-8124-a77064659911" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.628224 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.631498 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.631547 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.631547 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.631699 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.632065 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.636404 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld"] Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.758542 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjb29\" (UniqueName: \"kubernetes.io/projected/629dfd56-994c-4d9e-ba10-ecd79d750142-kube-api-access-hjb29\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rgzld\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.758759 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rgzld\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.758883 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rgzld\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.758961 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rgzld\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.860071 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rgzld\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.860167 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjb29\" (UniqueName: \"kubernetes.io/projected/629dfd56-994c-4d9e-ba10-ecd79d750142-kube-api-access-hjb29\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rgzld\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.860232 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rgzld\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.860290 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rgzld\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.864465 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rgzld\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.864959 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rgzld\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.873434 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rgzld\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.881789 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjb29\" (UniqueName: \"kubernetes.io/projected/629dfd56-994c-4d9e-ba10-ecd79d750142-kube-api-access-hjb29\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rgzld\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:39 crc kubenswrapper[4755]: I1006 08:57:39.953313 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:57:40 crc kubenswrapper[4755]: I1006 08:57:40.529960 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld"] Oct 06 08:57:41 crc kubenswrapper[4755]: I1006 08:57:41.559009 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" event={"ID":"629dfd56-994c-4d9e-ba10-ecd79d750142","Type":"ContainerStarted","Data":"ef0222ce2f7ae457a14f46539e1251cd5aa7436a39595065458b1a4c9611d8fe"} Oct 06 08:57:41 crc kubenswrapper[4755]: I1006 08:57:41.559248 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" event={"ID":"629dfd56-994c-4d9e-ba10-ecd79d750142","Type":"ContainerStarted","Data":"f921534bef67a9be4890450c4521c8c362423c7ef00d44476042789e8d6f9c35"} Oct 06 08:57:41 crc kubenswrapper[4755]: I1006 08:57:41.580966 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" podStartSLOduration=1.924619196 podStartE2EDuration="2.580944951s" podCreationTimestamp="2025-10-06 08:57:39 +0000 UTC" firstStartedPulling="2025-10-06 08:57:40.545293083 +0000 UTC m=+2117.374608307" lastFinishedPulling="2025-10-06 08:57:41.201618848 +0000 UTC m=+2118.030934062" observedRunningTime="2025-10-06 08:57:41.575691913 +0000 UTC 
m=+2118.405007137" watchObservedRunningTime="2025-10-06 08:57:41.580944951 +0000 UTC m=+2118.410260165" Oct 06 08:57:48 crc kubenswrapper[4755]: I1006 08:57:48.912065 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:57:48 crc kubenswrapper[4755]: I1006 08:57:48.912664 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.165528 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pzz"] Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.168089 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.179981 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pzz"] Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.330723 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76462eb9-2b89-4eef-8d47-849c6f1f9d22-catalog-content\") pod \"redhat-marketplace-x2pzz\" (UID: \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\") " pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.330814 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76462eb9-2b89-4eef-8d47-849c6f1f9d22-utilities\") pod \"redhat-marketplace-x2pzz\" (UID: \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\") " pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.331101 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr7c8\" (UniqueName: \"kubernetes.io/projected/76462eb9-2b89-4eef-8d47-849c6f1f9d22-kube-api-access-cr7c8\") pod \"redhat-marketplace-x2pzz\" (UID: \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\") " pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.433162 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76462eb9-2b89-4eef-8d47-849c6f1f9d22-utilities\") pod \"redhat-marketplace-x2pzz\" (UID: \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\") " pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.433325 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cr7c8\" (UniqueName: \"kubernetes.io/projected/76462eb9-2b89-4eef-8d47-849c6f1f9d22-kube-api-access-cr7c8\") pod \"redhat-marketplace-x2pzz\" (UID: \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\") " pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.433366 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76462eb9-2b89-4eef-8d47-849c6f1f9d22-catalog-content\") pod \"redhat-marketplace-x2pzz\" (UID: \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\") " pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.433793 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76462eb9-2b89-4eef-8d47-849c6f1f9d22-utilities\") pod \"redhat-marketplace-x2pzz\" (UID: \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\") " pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.433825 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76462eb9-2b89-4eef-8d47-849c6f1f9d22-catalog-content\") pod \"redhat-marketplace-x2pzz\" (UID: \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\") " pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.451431 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr7c8\" (UniqueName: \"kubernetes.io/projected/76462eb9-2b89-4eef-8d47-849c6f1f9d22-kube-api-access-cr7c8\") pod \"redhat-marketplace-x2pzz\" (UID: \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\") " pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.522628 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:57:52 crc kubenswrapper[4755]: I1006 08:57:52.976438 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pzz"] Oct 06 08:57:53 crc kubenswrapper[4755]: I1006 08:57:53.676809 4755 generic.go:334] "Generic (PLEG): container finished" podID="76462eb9-2b89-4eef-8d47-849c6f1f9d22" containerID="adda7e45212c5f790b6e079b03bee3609a706e59b4ddaa6bc875e29333da74bb" exitCode=0 Oct 06 08:57:53 crc kubenswrapper[4755]: I1006 08:57:53.676859 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pzz" event={"ID":"76462eb9-2b89-4eef-8d47-849c6f1f9d22","Type":"ContainerDied","Data":"adda7e45212c5f790b6e079b03bee3609a706e59b4ddaa6bc875e29333da74bb"} Oct 06 08:57:53 crc kubenswrapper[4755]: I1006 08:57:53.677130 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pzz" event={"ID":"76462eb9-2b89-4eef-8d47-849c6f1f9d22","Type":"ContainerStarted","Data":"4743dc0fed08308505569ad5bcef4ae4820275d6d71246ec64996fbb8e9a4b56"} Oct 06 08:57:54 crc kubenswrapper[4755]: I1006 08:57:54.685236 4755 generic.go:334] "Generic (PLEG): container finished" podID="76462eb9-2b89-4eef-8d47-849c6f1f9d22" containerID="a485b1d7a22041c2a0a37eca2cc91926d96145bb14a1e7b051edb32c3d9b5616" exitCode=0 Oct 06 08:57:54 crc kubenswrapper[4755]: I1006 08:57:54.685435 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pzz" event={"ID":"76462eb9-2b89-4eef-8d47-849c6f1f9d22","Type":"ContainerDied","Data":"a485b1d7a22041c2a0a37eca2cc91926d96145bb14a1e7b051edb32c3d9b5616"} Oct 06 08:57:55 crc kubenswrapper[4755]: I1006 08:57:55.695949 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pzz" 
event={"ID":"76462eb9-2b89-4eef-8d47-849c6f1f9d22","Type":"ContainerStarted","Data":"9f281cf5608c87815fe34a975b587adf75551eb2782435a946125951702767ba"} Oct 06 08:57:55 crc kubenswrapper[4755]: I1006 08:57:55.717246 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x2pzz" podStartSLOduration=2.181775385 podStartE2EDuration="3.717226277s" podCreationTimestamp="2025-10-06 08:57:52 +0000 UTC" firstStartedPulling="2025-10-06 08:57:53.681435997 +0000 UTC m=+2130.510751211" lastFinishedPulling="2025-10-06 08:57:55.216886889 +0000 UTC m=+2132.046202103" observedRunningTime="2025-10-06 08:57:55.713505376 +0000 UTC m=+2132.542820600" watchObservedRunningTime="2025-10-06 08:57:55.717226277 +0000 UTC m=+2132.546541501" Oct 06 08:57:59 crc kubenswrapper[4755]: I1006 08:57:59.924039 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8jxmp"] Oct 06 08:57:59 crc kubenswrapper[4755]: I1006 08:57:59.926608 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:57:59 crc kubenswrapper[4755]: I1006 08:57:59.937578 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8jxmp"] Oct 06 08:58:00 crc kubenswrapper[4755]: I1006 08:58:00.079735 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr28s\" (UniqueName: \"kubernetes.io/projected/129b5629-256b-4afe-aac3-4f06dd4e6030-kube-api-access-lr28s\") pod \"certified-operators-8jxmp\" (UID: \"129b5629-256b-4afe-aac3-4f06dd4e6030\") " pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:00 crc kubenswrapper[4755]: I1006 08:58:00.079785 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129b5629-256b-4afe-aac3-4f06dd4e6030-utilities\") pod \"certified-operators-8jxmp\" (UID: \"129b5629-256b-4afe-aac3-4f06dd4e6030\") " pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:00 crc kubenswrapper[4755]: I1006 08:58:00.079816 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129b5629-256b-4afe-aac3-4f06dd4e6030-catalog-content\") pod \"certified-operators-8jxmp\" (UID: \"129b5629-256b-4afe-aac3-4f06dd4e6030\") " pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:00 crc kubenswrapper[4755]: I1006 08:58:00.181887 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129b5629-256b-4afe-aac3-4f06dd4e6030-utilities\") pod \"certified-operators-8jxmp\" (UID: \"129b5629-256b-4afe-aac3-4f06dd4e6030\") " pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:00 crc kubenswrapper[4755]: I1006 08:58:00.181953 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129b5629-256b-4afe-aac3-4f06dd4e6030-catalog-content\") pod \"certified-operators-8jxmp\" (UID: \"129b5629-256b-4afe-aac3-4f06dd4e6030\") " pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:00 crc kubenswrapper[4755]: I1006 08:58:00.182128 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr28s\" (UniqueName: \"kubernetes.io/projected/129b5629-256b-4afe-aac3-4f06dd4e6030-kube-api-access-lr28s\") pod \"certified-operators-8jxmp\" (UID: \"129b5629-256b-4afe-aac3-4f06dd4e6030\") " pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:00 crc kubenswrapper[4755]: I1006 08:58:00.182435 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129b5629-256b-4afe-aac3-4f06dd4e6030-utilities\") pod \"certified-operators-8jxmp\" (UID: \"129b5629-256b-4afe-aac3-4f06dd4e6030\") " pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:00 crc kubenswrapper[4755]: I1006 08:58:00.182488 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129b5629-256b-4afe-aac3-4f06dd4e6030-catalog-content\") pod \"certified-operators-8jxmp\" (UID: \"129b5629-256b-4afe-aac3-4f06dd4e6030\") " pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:00 crc kubenswrapper[4755]: I1006 08:58:00.216542 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr28s\" (UniqueName: \"kubernetes.io/projected/129b5629-256b-4afe-aac3-4f06dd4e6030-kube-api-access-lr28s\") pod \"certified-operators-8jxmp\" (UID: \"129b5629-256b-4afe-aac3-4f06dd4e6030\") " pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:00 crc kubenswrapper[4755]: I1006 08:58:00.248903 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:00 crc kubenswrapper[4755]: I1006 08:58:00.775251 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8jxmp"] Oct 06 08:58:01 crc kubenswrapper[4755]: I1006 08:58:01.746377 4755 generic.go:334] "Generic (PLEG): container finished" podID="129b5629-256b-4afe-aac3-4f06dd4e6030" containerID="3bfdad83f0d7c53b713d16a8ecd647b30c7a431c8ea20aca70dd679b2aacce3a" exitCode=0 Oct 06 08:58:01 crc kubenswrapper[4755]: I1006 08:58:01.746771 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8jxmp" event={"ID":"129b5629-256b-4afe-aac3-4f06dd4e6030","Type":"ContainerDied","Data":"3bfdad83f0d7c53b713d16a8ecd647b30c7a431c8ea20aca70dd679b2aacce3a"} Oct 06 08:58:01 crc kubenswrapper[4755]: I1006 08:58:01.748669 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8jxmp" event={"ID":"129b5629-256b-4afe-aac3-4f06dd4e6030","Type":"ContainerStarted","Data":"76042c07d27c0042042c6d707023e9ab0c5d74853934ab81941db77c0670fa18"} Oct 06 08:58:02 crc kubenswrapper[4755]: I1006 08:58:02.523140 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:58:02 crc kubenswrapper[4755]: I1006 08:58:02.523454 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:58:02 crc kubenswrapper[4755]: I1006 08:58:02.571255 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:58:02 crc kubenswrapper[4755]: I1006 08:58:02.757011 4755 generic.go:334] "Generic (PLEG): container finished" podID="129b5629-256b-4afe-aac3-4f06dd4e6030" containerID="0a8792e4d57337ee6371c2456d9a795906fdb4163f310503c4026347f8f05126" exitCode=0 Oct 06 08:58:02 crc 
kubenswrapper[4755]: I1006 08:58:02.757032 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8jxmp" event={"ID":"129b5629-256b-4afe-aac3-4f06dd4e6030","Type":"ContainerDied","Data":"0a8792e4d57337ee6371c2456d9a795906fdb4163f310503c4026347f8f05126"} Oct 06 08:58:02 crc kubenswrapper[4755]: I1006 08:58:02.804812 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:58:03 crc kubenswrapper[4755]: I1006 08:58:03.766820 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8jxmp" event={"ID":"129b5629-256b-4afe-aac3-4f06dd4e6030","Type":"ContainerStarted","Data":"b77d89e19461a2c049deebbcfb8de9b2d523f33df562d57bb83b9ca5f8eab18a"} Oct 06 08:58:03 crc kubenswrapper[4755]: I1006 08:58:03.784106 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8jxmp" podStartSLOduration=3.334527937 podStartE2EDuration="4.784087568s" podCreationTimestamp="2025-10-06 08:57:59 +0000 UTC" firstStartedPulling="2025-10-06 08:58:01.748510034 +0000 UTC m=+2138.577825248" lastFinishedPulling="2025-10-06 08:58:03.198069665 +0000 UTC m=+2140.027384879" observedRunningTime="2025-10-06 08:58:03.781083945 +0000 UTC m=+2140.610399159" watchObservedRunningTime="2025-10-06 08:58:03.784087568 +0000 UTC m=+2140.613402782" Oct 06 08:58:04 crc kubenswrapper[4755]: I1006 08:58:04.906452 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pzz"] Oct 06 08:58:04 crc kubenswrapper[4755]: I1006 08:58:04.906899 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x2pzz" podUID="76462eb9-2b89-4eef-8d47-849c6f1f9d22" containerName="registry-server" containerID="cri-o://9f281cf5608c87815fe34a975b587adf75551eb2782435a946125951702767ba" gracePeriod=2 Oct 06 
08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.342421 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.362340 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76462eb9-2b89-4eef-8d47-849c6f1f9d22-utilities\") pod \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\" (UID: \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\") " Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.362413 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76462eb9-2b89-4eef-8d47-849c6f1f9d22-catalog-content\") pod \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\" (UID: \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\") " Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.362462 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr7c8\" (UniqueName: \"kubernetes.io/projected/76462eb9-2b89-4eef-8d47-849c6f1f9d22-kube-api-access-cr7c8\") pod \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\" (UID: \"76462eb9-2b89-4eef-8d47-849c6f1f9d22\") " Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.363438 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76462eb9-2b89-4eef-8d47-849c6f1f9d22-utilities" (OuterVolumeSpecName: "utilities") pod "76462eb9-2b89-4eef-8d47-849c6f1f9d22" (UID: "76462eb9-2b89-4eef-8d47-849c6f1f9d22"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.376677 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76462eb9-2b89-4eef-8d47-849c6f1f9d22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76462eb9-2b89-4eef-8d47-849c6f1f9d22" (UID: "76462eb9-2b89-4eef-8d47-849c6f1f9d22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.378730 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76462eb9-2b89-4eef-8d47-849c6f1f9d22-kube-api-access-cr7c8" (OuterVolumeSpecName: "kube-api-access-cr7c8") pod "76462eb9-2b89-4eef-8d47-849c6f1f9d22" (UID: "76462eb9-2b89-4eef-8d47-849c6f1f9d22"). InnerVolumeSpecName "kube-api-access-cr7c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.464610 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr7c8\" (UniqueName: \"kubernetes.io/projected/76462eb9-2b89-4eef-8d47-849c6f1f9d22-kube-api-access-cr7c8\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.464655 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76462eb9-2b89-4eef-8d47-849c6f1f9d22-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.464690 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76462eb9-2b89-4eef-8d47-849c6f1f9d22-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.785109 4755 generic.go:334] "Generic (PLEG): container finished" podID="76462eb9-2b89-4eef-8d47-849c6f1f9d22" 
containerID="9f281cf5608c87815fe34a975b587adf75551eb2782435a946125951702767ba" exitCode=0 Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.785173 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2pzz" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.785191 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pzz" event={"ID":"76462eb9-2b89-4eef-8d47-849c6f1f9d22","Type":"ContainerDied","Data":"9f281cf5608c87815fe34a975b587adf75551eb2782435a946125951702767ba"} Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.789749 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pzz" event={"ID":"76462eb9-2b89-4eef-8d47-849c6f1f9d22","Type":"ContainerDied","Data":"4743dc0fed08308505569ad5bcef4ae4820275d6d71246ec64996fbb8e9a4b56"} Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.789791 4755 scope.go:117] "RemoveContainer" containerID="9f281cf5608c87815fe34a975b587adf75551eb2782435a946125951702767ba" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.820332 4755 scope.go:117] "RemoveContainer" containerID="a485b1d7a22041c2a0a37eca2cc91926d96145bb14a1e7b051edb32c3d9b5616" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.822344 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pzz"] Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.834186 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pzz"] Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.841275 4755 scope.go:117] "RemoveContainer" containerID="adda7e45212c5f790b6e079b03bee3609a706e59b4ddaa6bc875e29333da74bb" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.880159 4755 scope.go:117] "RemoveContainer" containerID="9f281cf5608c87815fe34a975b587adf75551eb2782435a946125951702767ba" Oct 06 
08:58:05 crc kubenswrapper[4755]: E1006 08:58:05.880506 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f281cf5608c87815fe34a975b587adf75551eb2782435a946125951702767ba\": container with ID starting with 9f281cf5608c87815fe34a975b587adf75551eb2782435a946125951702767ba not found: ID does not exist" containerID="9f281cf5608c87815fe34a975b587adf75551eb2782435a946125951702767ba" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.880534 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f281cf5608c87815fe34a975b587adf75551eb2782435a946125951702767ba"} err="failed to get container status \"9f281cf5608c87815fe34a975b587adf75551eb2782435a946125951702767ba\": rpc error: code = NotFound desc = could not find container \"9f281cf5608c87815fe34a975b587adf75551eb2782435a946125951702767ba\": container with ID starting with 9f281cf5608c87815fe34a975b587adf75551eb2782435a946125951702767ba not found: ID does not exist" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.880555 4755 scope.go:117] "RemoveContainer" containerID="a485b1d7a22041c2a0a37eca2cc91926d96145bb14a1e7b051edb32c3d9b5616" Oct 06 08:58:05 crc kubenswrapper[4755]: E1006 08:58:05.880812 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a485b1d7a22041c2a0a37eca2cc91926d96145bb14a1e7b051edb32c3d9b5616\": container with ID starting with a485b1d7a22041c2a0a37eca2cc91926d96145bb14a1e7b051edb32c3d9b5616 not found: ID does not exist" containerID="a485b1d7a22041c2a0a37eca2cc91926d96145bb14a1e7b051edb32c3d9b5616" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.880833 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a485b1d7a22041c2a0a37eca2cc91926d96145bb14a1e7b051edb32c3d9b5616"} err="failed to get container status 
\"a485b1d7a22041c2a0a37eca2cc91926d96145bb14a1e7b051edb32c3d9b5616\": rpc error: code = NotFound desc = could not find container \"a485b1d7a22041c2a0a37eca2cc91926d96145bb14a1e7b051edb32c3d9b5616\": container with ID starting with a485b1d7a22041c2a0a37eca2cc91926d96145bb14a1e7b051edb32c3d9b5616 not found: ID does not exist" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.880846 4755 scope.go:117] "RemoveContainer" containerID="adda7e45212c5f790b6e079b03bee3609a706e59b4ddaa6bc875e29333da74bb" Oct 06 08:58:05 crc kubenswrapper[4755]: E1006 08:58:05.881035 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adda7e45212c5f790b6e079b03bee3609a706e59b4ddaa6bc875e29333da74bb\": container with ID starting with adda7e45212c5f790b6e079b03bee3609a706e59b4ddaa6bc875e29333da74bb not found: ID does not exist" containerID="adda7e45212c5f790b6e079b03bee3609a706e59b4ddaa6bc875e29333da74bb" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.881061 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adda7e45212c5f790b6e079b03bee3609a706e59b4ddaa6bc875e29333da74bb"} err="failed to get container status \"adda7e45212c5f790b6e079b03bee3609a706e59b4ddaa6bc875e29333da74bb\": rpc error: code = NotFound desc = could not find container \"adda7e45212c5f790b6e079b03bee3609a706e59b4ddaa6bc875e29333da74bb\": container with ID starting with adda7e45212c5f790b6e079b03bee3609a706e59b4ddaa6bc875e29333da74bb not found: ID does not exist" Oct 06 08:58:05 crc kubenswrapper[4755]: I1006 08:58:05.891113 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76462eb9-2b89-4eef-8d47-849c6f1f9d22" path="/var/lib/kubelet/pods/76462eb9-2b89-4eef-8d47-849c6f1f9d22/volumes" Oct 06 08:58:10 crc kubenswrapper[4755]: I1006 08:58:10.249950 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:10 crc kubenswrapper[4755]: I1006 08:58:10.250563 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:10 crc kubenswrapper[4755]: I1006 08:58:10.313551 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:10 crc kubenswrapper[4755]: I1006 08:58:10.881392 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:10 crc kubenswrapper[4755]: I1006 08:58:10.922981 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8jxmp"] Oct 06 08:58:12 crc kubenswrapper[4755]: I1006 08:58:12.848680 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8jxmp" podUID="129b5629-256b-4afe-aac3-4f06dd4e6030" containerName="registry-server" containerID="cri-o://b77d89e19461a2c049deebbcfb8de9b2d523f33df562d57bb83b9ca5f8eab18a" gracePeriod=2 Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.372721 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.503318 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129b5629-256b-4afe-aac3-4f06dd4e6030-catalog-content\") pod \"129b5629-256b-4afe-aac3-4f06dd4e6030\" (UID: \"129b5629-256b-4afe-aac3-4f06dd4e6030\") " Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.503428 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129b5629-256b-4afe-aac3-4f06dd4e6030-utilities\") pod \"129b5629-256b-4afe-aac3-4f06dd4e6030\" (UID: \"129b5629-256b-4afe-aac3-4f06dd4e6030\") " Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.503681 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr28s\" (UniqueName: \"kubernetes.io/projected/129b5629-256b-4afe-aac3-4f06dd4e6030-kube-api-access-lr28s\") pod \"129b5629-256b-4afe-aac3-4f06dd4e6030\" (UID: \"129b5629-256b-4afe-aac3-4f06dd4e6030\") " Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.504796 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/129b5629-256b-4afe-aac3-4f06dd4e6030-utilities" (OuterVolumeSpecName: "utilities") pod "129b5629-256b-4afe-aac3-4f06dd4e6030" (UID: "129b5629-256b-4afe-aac3-4f06dd4e6030"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.512849 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/129b5629-256b-4afe-aac3-4f06dd4e6030-kube-api-access-lr28s" (OuterVolumeSpecName: "kube-api-access-lr28s") pod "129b5629-256b-4afe-aac3-4f06dd4e6030" (UID: "129b5629-256b-4afe-aac3-4f06dd4e6030"). InnerVolumeSpecName "kube-api-access-lr28s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.549499 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/129b5629-256b-4afe-aac3-4f06dd4e6030-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "129b5629-256b-4afe-aac3-4f06dd4e6030" (UID: "129b5629-256b-4afe-aac3-4f06dd4e6030"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.606984 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr28s\" (UniqueName: \"kubernetes.io/projected/129b5629-256b-4afe-aac3-4f06dd4e6030-kube-api-access-lr28s\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.607028 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129b5629-256b-4afe-aac3-4f06dd4e6030-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.607050 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129b5629-256b-4afe-aac3-4f06dd4e6030-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.860785 4755 generic.go:334] "Generic (PLEG): container finished" podID="129b5629-256b-4afe-aac3-4f06dd4e6030" containerID="b77d89e19461a2c049deebbcfb8de9b2d523f33df562d57bb83b9ca5f8eab18a" exitCode=0 Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.860840 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8jxmp" event={"ID":"129b5629-256b-4afe-aac3-4f06dd4e6030","Type":"ContainerDied","Data":"b77d89e19461a2c049deebbcfb8de9b2d523f33df562d57bb83b9ca5f8eab18a"} Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.860861 4755 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8jxmp" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.860870 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8jxmp" event={"ID":"129b5629-256b-4afe-aac3-4f06dd4e6030","Type":"ContainerDied","Data":"76042c07d27c0042042c6d707023e9ab0c5d74853934ab81941db77c0670fa18"} Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.860890 4755 scope.go:117] "RemoveContainer" containerID="b77d89e19461a2c049deebbcfb8de9b2d523f33df562d57bb83b9ca5f8eab18a" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.906951 4755 scope.go:117] "RemoveContainer" containerID="0a8792e4d57337ee6371c2456d9a795906fdb4163f310503c4026347f8f05126" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.908654 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8jxmp"] Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.923180 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8jxmp"] Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.941136 4755 scope.go:117] "RemoveContainer" containerID="3bfdad83f0d7c53b713d16a8ecd647b30c7a431c8ea20aca70dd679b2aacce3a" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.968328 4755 scope.go:117] "RemoveContainer" containerID="b77d89e19461a2c049deebbcfb8de9b2d523f33df562d57bb83b9ca5f8eab18a" Oct 06 08:58:13 crc kubenswrapper[4755]: E1006 08:58:13.969351 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b77d89e19461a2c049deebbcfb8de9b2d523f33df562d57bb83b9ca5f8eab18a\": container with ID starting with b77d89e19461a2c049deebbcfb8de9b2d523f33df562d57bb83b9ca5f8eab18a not found: ID does not exist" containerID="b77d89e19461a2c049deebbcfb8de9b2d523f33df562d57bb83b9ca5f8eab18a" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.969419 
4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77d89e19461a2c049deebbcfb8de9b2d523f33df562d57bb83b9ca5f8eab18a"} err="failed to get container status \"b77d89e19461a2c049deebbcfb8de9b2d523f33df562d57bb83b9ca5f8eab18a\": rpc error: code = NotFound desc = could not find container \"b77d89e19461a2c049deebbcfb8de9b2d523f33df562d57bb83b9ca5f8eab18a\": container with ID starting with b77d89e19461a2c049deebbcfb8de9b2d523f33df562d57bb83b9ca5f8eab18a not found: ID does not exist" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.969473 4755 scope.go:117] "RemoveContainer" containerID="0a8792e4d57337ee6371c2456d9a795906fdb4163f310503c4026347f8f05126" Oct 06 08:58:13 crc kubenswrapper[4755]: E1006 08:58:13.970068 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a8792e4d57337ee6371c2456d9a795906fdb4163f310503c4026347f8f05126\": container with ID starting with 0a8792e4d57337ee6371c2456d9a795906fdb4163f310503c4026347f8f05126 not found: ID does not exist" containerID="0a8792e4d57337ee6371c2456d9a795906fdb4163f310503c4026347f8f05126" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.970103 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8792e4d57337ee6371c2456d9a795906fdb4163f310503c4026347f8f05126"} err="failed to get container status \"0a8792e4d57337ee6371c2456d9a795906fdb4163f310503c4026347f8f05126\": rpc error: code = NotFound desc = could not find container \"0a8792e4d57337ee6371c2456d9a795906fdb4163f310503c4026347f8f05126\": container with ID starting with 0a8792e4d57337ee6371c2456d9a795906fdb4163f310503c4026347f8f05126 not found: ID does not exist" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.970128 4755 scope.go:117] "RemoveContainer" containerID="3bfdad83f0d7c53b713d16a8ecd647b30c7a431c8ea20aca70dd679b2aacce3a" Oct 06 08:58:13 crc kubenswrapper[4755]: E1006 
08:58:13.970538 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bfdad83f0d7c53b713d16a8ecd647b30c7a431c8ea20aca70dd679b2aacce3a\": container with ID starting with 3bfdad83f0d7c53b713d16a8ecd647b30c7a431c8ea20aca70dd679b2aacce3a not found: ID does not exist" containerID="3bfdad83f0d7c53b713d16a8ecd647b30c7a431c8ea20aca70dd679b2aacce3a" Oct 06 08:58:13 crc kubenswrapper[4755]: I1006 08:58:13.970571 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bfdad83f0d7c53b713d16a8ecd647b30c7a431c8ea20aca70dd679b2aacce3a"} err="failed to get container status \"3bfdad83f0d7c53b713d16a8ecd647b30c7a431c8ea20aca70dd679b2aacce3a\": rpc error: code = NotFound desc = could not find container \"3bfdad83f0d7c53b713d16a8ecd647b30c7a431c8ea20aca70dd679b2aacce3a\": container with ID starting with 3bfdad83f0d7c53b713d16a8ecd647b30c7a431c8ea20aca70dd679b2aacce3a not found: ID does not exist" Oct 06 08:58:14 crc kubenswrapper[4755]: I1006 08:58:14.871365 4755 generic.go:334] "Generic (PLEG): container finished" podID="629dfd56-994c-4d9e-ba10-ecd79d750142" containerID="ef0222ce2f7ae457a14f46539e1251cd5aa7436a39595065458b1a4c9611d8fe" exitCode=0 Oct 06 08:58:14 crc kubenswrapper[4755]: I1006 08:58:14.871891 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" event={"ID":"629dfd56-994c-4d9e-ba10-ecd79d750142","Type":"ContainerDied","Data":"ef0222ce2f7ae457a14f46539e1251cd5aa7436a39595065458b1a4c9611d8fe"} Oct 06 08:58:15 crc kubenswrapper[4755]: I1006 08:58:15.895493 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="129b5629-256b-4afe-aac3-4f06dd4e6030" path="/var/lib/kubelet/pods/129b5629-256b-4afe-aac3-4f06dd4e6030/volumes" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.267386 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.462965 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-ceph\") pod \"629dfd56-994c-4d9e-ba10-ecd79d750142\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.463051 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjb29\" (UniqueName: \"kubernetes.io/projected/629dfd56-994c-4d9e-ba10-ecd79d750142-kube-api-access-hjb29\") pod \"629dfd56-994c-4d9e-ba10-ecd79d750142\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.463103 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-ssh-key\") pod \"629dfd56-994c-4d9e-ba10-ecd79d750142\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.463135 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-inventory\") pod \"629dfd56-994c-4d9e-ba10-ecd79d750142\" (UID: \"629dfd56-994c-4d9e-ba10-ecd79d750142\") " Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.470895 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/629dfd56-994c-4d9e-ba10-ecd79d750142-kube-api-access-hjb29" (OuterVolumeSpecName: "kube-api-access-hjb29") pod "629dfd56-994c-4d9e-ba10-ecd79d750142" (UID: "629dfd56-994c-4d9e-ba10-ecd79d750142"). InnerVolumeSpecName "kube-api-access-hjb29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.475882 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-ceph" (OuterVolumeSpecName: "ceph") pod "629dfd56-994c-4d9e-ba10-ecd79d750142" (UID: "629dfd56-994c-4d9e-ba10-ecd79d750142"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.490714 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "629dfd56-994c-4d9e-ba10-ecd79d750142" (UID: "629dfd56-994c-4d9e-ba10-ecd79d750142"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.491315 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-inventory" (OuterVolumeSpecName: "inventory") pod "629dfd56-994c-4d9e-ba10-ecd79d750142" (UID: "629dfd56-994c-4d9e-ba10-ecd79d750142"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.566180 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.566232 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjb29\" (UniqueName: \"kubernetes.io/projected/629dfd56-994c-4d9e-ba10-ecd79d750142-kube-api-access-hjb29\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.566249 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.566260 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/629dfd56-994c-4d9e-ba10-ecd79d750142-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.886643 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" event={"ID":"629dfd56-994c-4d9e-ba10-ecd79d750142","Type":"ContainerDied","Data":"f921534bef67a9be4890450c4521c8c362423c7ef00d44476042789e8d6f9c35"} Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.886939 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f921534bef67a9be4890450c4521c8c362423c7ef00d44476042789e8d6f9c35" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.886714 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rgzld" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.974664 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c"] Oct 06 08:58:16 crc kubenswrapper[4755]: E1006 08:58:16.975083 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129b5629-256b-4afe-aac3-4f06dd4e6030" containerName="extract-utilities" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.975098 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="129b5629-256b-4afe-aac3-4f06dd4e6030" containerName="extract-utilities" Oct 06 08:58:16 crc kubenswrapper[4755]: E1006 08:58:16.975117 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="629dfd56-994c-4d9e-ba10-ecd79d750142" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.975126 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="629dfd56-994c-4d9e-ba10-ecd79d750142" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:58:16 crc kubenswrapper[4755]: E1006 08:58:16.975141 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76462eb9-2b89-4eef-8d47-849c6f1f9d22" containerName="extract-utilities" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.975148 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="76462eb9-2b89-4eef-8d47-849c6f1f9d22" containerName="extract-utilities" Oct 06 08:58:16 crc kubenswrapper[4755]: E1006 08:58:16.975159 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76462eb9-2b89-4eef-8d47-849c6f1f9d22" containerName="extract-content" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.975167 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="76462eb9-2b89-4eef-8d47-849c6f1f9d22" containerName="extract-content" Oct 06 08:58:16 crc kubenswrapper[4755]: E1006 
08:58:16.975179 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129b5629-256b-4afe-aac3-4f06dd4e6030" containerName="extract-content" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.975186 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="129b5629-256b-4afe-aac3-4f06dd4e6030" containerName="extract-content" Oct 06 08:58:16 crc kubenswrapper[4755]: E1006 08:58:16.975198 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76462eb9-2b89-4eef-8d47-849c6f1f9d22" containerName="registry-server" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.975206 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="76462eb9-2b89-4eef-8d47-849c6f1f9d22" containerName="registry-server" Oct 06 08:58:16 crc kubenswrapper[4755]: E1006 08:58:16.975235 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129b5629-256b-4afe-aac3-4f06dd4e6030" containerName="registry-server" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.975242 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="129b5629-256b-4afe-aac3-4f06dd4e6030" containerName="registry-server" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.975435 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="629dfd56-994c-4d9e-ba10-ecd79d750142" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.975452 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="129b5629-256b-4afe-aac3-4f06dd4e6030" containerName="registry-server" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.975466 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="76462eb9-2b89-4eef-8d47-849c6f1f9d22" containerName="registry-server" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.976287 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.978176 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.979947 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.979956 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.981786 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c"] Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.981964 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 08:58:16 crc kubenswrapper[4755]: I1006 08:58:16.981973 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.097477 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.097633 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw28w\" (UniqueName: \"kubernetes.io/projected/0f5c98c8-0bd9-4a85-b547-9c39183abe87-kube-api-access-dw28w\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c\" (UID: 
\"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.097656 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.097698 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.200101 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw28w\" (UniqueName: \"kubernetes.io/projected/0f5c98c8-0bd9-4a85-b547-9c39183abe87-kube-api-access-dw28w\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.200159 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.200217 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.200262 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.204284 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.204348 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.208059 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:17 crc 
kubenswrapper[4755]: I1006 08:58:17.222403 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw28w\" (UniqueName: \"kubernetes.io/projected/0f5c98c8-0bd9-4a85-b547-9c39183abe87-kube-api-access-dw28w\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.299099 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.825403 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c"] Oct 06 08:58:17 crc kubenswrapper[4755]: I1006 08:58:17.894097 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" event={"ID":"0f5c98c8-0bd9-4a85-b547-9c39183abe87","Type":"ContainerStarted","Data":"2f4c60797c24a1783eaaa91e7cc27ecf91b145be894d4d5a213196ca885bf2e5"} Oct 06 08:58:18 crc kubenswrapper[4755]: I1006 08:58:18.901989 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" event={"ID":"0f5c98c8-0bd9-4a85-b547-9c39183abe87","Type":"ContainerStarted","Data":"15a141fa74cf786ca780e3d77de9d2bbb58d3552c60e7279d1a004ca5c76c611"} Oct 06 08:58:18 crc kubenswrapper[4755]: I1006 08:58:18.912788 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:58:18 crc kubenswrapper[4755]: I1006 08:58:18.912857 4755 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:58:18 crc kubenswrapper[4755]: I1006 08:58:18.922144 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" podStartSLOduration=2.478328416 podStartE2EDuration="2.922125648s" podCreationTimestamp="2025-10-06 08:58:16 +0000 UTC" firstStartedPulling="2025-10-06 08:58:17.831010881 +0000 UTC m=+2154.660326095" lastFinishedPulling="2025-10-06 08:58:18.274808113 +0000 UTC m=+2155.104123327" observedRunningTime="2025-10-06 08:58:18.916278756 +0000 UTC m=+2155.745594010" watchObservedRunningTime="2025-10-06 08:58:18.922125648 +0000 UTC m=+2155.751440862" Oct 06 08:58:21 crc kubenswrapper[4755]: I1006 08:58:21.927408 4755 generic.go:334] "Generic (PLEG): container finished" podID="0f5c98c8-0bd9-4a85-b547-9c39183abe87" containerID="15a141fa74cf786ca780e3d77de9d2bbb58d3552c60e7279d1a004ca5c76c611" exitCode=0 Oct 06 08:58:21 crc kubenswrapper[4755]: I1006 08:58:21.927532 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" event={"ID":"0f5c98c8-0bd9-4a85-b547-9c39183abe87","Type":"ContainerDied","Data":"15a141fa74cf786ca780e3d77de9d2bbb58d3552c60e7279d1a004ca5c76c611"} Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.329906 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.409657 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-inventory\") pod \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.409735 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw28w\" (UniqueName: \"kubernetes.io/projected/0f5c98c8-0bd9-4a85-b547-9c39183abe87-kube-api-access-dw28w\") pod \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.409765 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-ceph\") pod \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.410065 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-ssh-key\") pod \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\" (UID: \"0f5c98c8-0bd9-4a85-b547-9c39183abe87\") " Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.416129 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f5c98c8-0bd9-4a85-b547-9c39183abe87-kube-api-access-dw28w" (OuterVolumeSpecName: "kube-api-access-dw28w") pod "0f5c98c8-0bd9-4a85-b547-9c39183abe87" (UID: "0f5c98c8-0bd9-4a85-b547-9c39183abe87"). InnerVolumeSpecName "kube-api-access-dw28w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.422884 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-ceph" (OuterVolumeSpecName: "ceph") pod "0f5c98c8-0bd9-4a85-b547-9c39183abe87" (UID: "0f5c98c8-0bd9-4a85-b547-9c39183abe87"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.438524 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-inventory" (OuterVolumeSpecName: "inventory") pod "0f5c98c8-0bd9-4a85-b547-9c39183abe87" (UID: "0f5c98c8-0bd9-4a85-b547-9c39183abe87"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.440760 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0f5c98c8-0bd9-4a85-b547-9c39183abe87" (UID: "0f5c98c8-0bd9-4a85-b547-9c39183abe87"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.512318 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.512370 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.512382 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw28w\" (UniqueName: \"kubernetes.io/projected/0f5c98c8-0bd9-4a85-b547-9c39183abe87-kube-api-access-dw28w\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.512395 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f5c98c8-0bd9-4a85-b547-9c39183abe87-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.945154 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" event={"ID":"0f5c98c8-0bd9-4a85-b547-9c39183abe87","Type":"ContainerDied","Data":"2f4c60797c24a1783eaaa91e7cc27ecf91b145be894d4d5a213196ca885bf2e5"} Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.945777 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4c60797c24a1783eaaa91e7cc27ecf91b145be894d4d5a213196ca885bf2e5" Oct 06 08:58:23 crc kubenswrapper[4755]: I1006 08:58:23.945277 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.022920 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l"] Oct 06 08:58:24 crc kubenswrapper[4755]: E1006 08:58:24.023287 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f5c98c8-0bd9-4a85-b547-9c39183abe87" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.023304 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f5c98c8-0bd9-4a85-b547-9c39183abe87" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.023488 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f5c98c8-0bd9-4a85-b547-9c39183abe87" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.024126 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.027033 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.027056 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.028473 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.029095 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.029287 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.053613 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l"] Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.123925 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ck77l\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.124065 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsg54\" (UniqueName: \"kubernetes.io/projected/5f68ca6b-cc42-460a-9490-b29b87004e16-kube-api-access-gsg54\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ck77l\" (UID: 
\"5f68ca6b-cc42-460a-9490-b29b87004e16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.124097 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ck77l\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.124210 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ck77l\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.225770 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ck77l\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.225896 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsg54\" (UniqueName: \"kubernetes.io/projected/5f68ca6b-cc42-460a-9490-b29b87004e16-kube-api-access-gsg54\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ck77l\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.225935 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ck77l\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.226482 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ck77l\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.229382 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ck77l\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.229522 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ck77l\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.230077 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ck77l\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" 
Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.243135 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsg54\" (UniqueName: \"kubernetes.io/projected/5f68ca6b-cc42-460a-9490-b29b87004e16-kube-api-access-gsg54\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ck77l\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.342184 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.865969 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l"] Oct 06 08:58:24 crc kubenswrapper[4755]: I1006 08:58:24.955805 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" event={"ID":"5f68ca6b-cc42-460a-9490-b29b87004e16","Type":"ContainerStarted","Data":"0d852e3f61abe77b1e637294b57c1e304f2723a116e89a121ea4d48d13adfeeb"} Oct 06 08:58:25 crc kubenswrapper[4755]: I1006 08:58:25.971500 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" event={"ID":"5f68ca6b-cc42-460a-9490-b29b87004e16","Type":"ContainerStarted","Data":"836beb80cd104c6c531f1720c634585a7205f3c3634fe7219bb0cbd6831a87fc"} Oct 06 08:58:25 crc kubenswrapper[4755]: I1006 08:58:25.997582 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" podStartSLOduration=1.6084413180000001 podStartE2EDuration="1.997541809s" podCreationTimestamp="2025-10-06 08:58:24 +0000 UTC" firstStartedPulling="2025-10-06 08:58:24.873991782 +0000 UTC m=+2161.703306986" lastFinishedPulling="2025-10-06 08:58:25.263092263 +0000 
UTC m=+2162.092407477" observedRunningTime="2025-10-06 08:58:25.995286834 +0000 UTC m=+2162.824602048" watchObservedRunningTime="2025-10-06 08:58:25.997541809 +0000 UTC m=+2162.826857043" Oct 06 08:58:48 crc kubenswrapper[4755]: I1006 08:58:48.912373 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:58:48 crc kubenswrapper[4755]: I1006 08:58:48.912869 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:58:48 crc kubenswrapper[4755]: I1006 08:58:48.912909 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 08:58:48 crc kubenswrapper[4755]: I1006 08:58:48.913515 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"07b1bac86ef25134b8ebed154053528dffbc3145250e0269cad9a7970e57b7da"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:58:48 crc kubenswrapper[4755]: I1006 08:58:48.913585 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://07b1bac86ef25134b8ebed154053528dffbc3145250e0269cad9a7970e57b7da" gracePeriod=600 Oct 06 08:58:49 crc 
kubenswrapper[4755]: I1006 08:58:49.173071 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="07b1bac86ef25134b8ebed154053528dffbc3145250e0269cad9a7970e57b7da" exitCode=0 Oct 06 08:58:49 crc kubenswrapper[4755]: I1006 08:58:49.173130 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"07b1bac86ef25134b8ebed154053528dffbc3145250e0269cad9a7970e57b7da"} Oct 06 08:58:49 crc kubenswrapper[4755]: I1006 08:58:49.173171 4755 scope.go:117] "RemoveContainer" containerID="010bb7a6238dc2ad4d9d12c3e5f67fe5050315a5d1b981dacfc5d79a362e7b73" Oct 06 08:58:50 crc kubenswrapper[4755]: I1006 08:58:50.183246 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a"} Oct 06 08:59:06 crc kubenswrapper[4755]: I1006 08:59:06.359696 4755 generic.go:334] "Generic (PLEG): container finished" podID="5f68ca6b-cc42-460a-9490-b29b87004e16" containerID="836beb80cd104c6c531f1720c634585a7205f3c3634fe7219bb0cbd6831a87fc" exitCode=0 Oct 06 08:59:06 crc kubenswrapper[4755]: I1006 08:59:06.359776 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" event={"ID":"5f68ca6b-cc42-460a-9490-b29b87004e16","Type":"ContainerDied","Data":"836beb80cd104c6c531f1720c634585a7205f3c3634fe7219bb0cbd6831a87fc"} Oct 06 08:59:07 crc kubenswrapper[4755]: I1006 08:59:07.801691 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:59:07 crc kubenswrapper[4755]: I1006 08:59:07.930315 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-ceph\") pod \"5f68ca6b-cc42-460a-9490-b29b87004e16\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " Oct 06 08:59:07 crc kubenswrapper[4755]: I1006 08:59:07.930695 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-inventory\") pod \"5f68ca6b-cc42-460a-9490-b29b87004e16\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " Oct 06 08:59:07 crc kubenswrapper[4755]: I1006 08:59:07.930803 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-ssh-key\") pod \"5f68ca6b-cc42-460a-9490-b29b87004e16\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " Oct 06 08:59:07 crc kubenswrapper[4755]: I1006 08:59:07.930843 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsg54\" (UniqueName: \"kubernetes.io/projected/5f68ca6b-cc42-460a-9490-b29b87004e16-kube-api-access-gsg54\") pod \"5f68ca6b-cc42-460a-9490-b29b87004e16\" (UID: \"5f68ca6b-cc42-460a-9490-b29b87004e16\") " Oct 06 08:59:07 crc kubenswrapper[4755]: I1006 08:59:07.936226 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-ceph" (OuterVolumeSpecName: "ceph") pod "5f68ca6b-cc42-460a-9490-b29b87004e16" (UID: "5f68ca6b-cc42-460a-9490-b29b87004e16"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:07 crc kubenswrapper[4755]: I1006 08:59:07.939746 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f68ca6b-cc42-460a-9490-b29b87004e16-kube-api-access-gsg54" (OuterVolumeSpecName: "kube-api-access-gsg54") pod "5f68ca6b-cc42-460a-9490-b29b87004e16" (UID: "5f68ca6b-cc42-460a-9490-b29b87004e16"). InnerVolumeSpecName "kube-api-access-gsg54". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:07 crc kubenswrapper[4755]: I1006 08:59:07.965686 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-inventory" (OuterVolumeSpecName: "inventory") pod "5f68ca6b-cc42-460a-9490-b29b87004e16" (UID: "5f68ca6b-cc42-460a-9490-b29b87004e16"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:07 crc kubenswrapper[4755]: I1006 08:59:07.967726 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5f68ca6b-cc42-460a-9490-b29b87004e16" (UID: "5f68ca6b-cc42-460a-9490-b29b87004e16"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.033728 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.033763 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.033775 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f68ca6b-cc42-460a-9490-b29b87004e16-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.033784 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsg54\" (UniqueName: \"kubernetes.io/projected/5f68ca6b-cc42-460a-9490-b29b87004e16-kube-api-access-gsg54\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.389271 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" event={"ID":"5f68ca6b-cc42-460a-9490-b29b87004e16","Type":"ContainerDied","Data":"0d852e3f61abe77b1e637294b57c1e304f2723a116e89a121ea4d48d13adfeeb"} Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.389313 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d852e3f61abe77b1e637294b57c1e304f2723a116e89a121ea4d48d13adfeeb" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.389419 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ck77l" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.481691 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-64v5h"] Oct 06 08:59:08 crc kubenswrapper[4755]: E1006 08:59:08.482223 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f68ca6b-cc42-460a-9490-b29b87004e16" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.482252 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f68ca6b-cc42-460a-9490-b29b87004e16" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.482505 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f68ca6b-cc42-460a-9490-b29b87004e16" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.483257 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.486945 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.487068 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.487189 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.487411 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.487434 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.503622 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-64v5h"] Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.650277 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq2jb\" (UniqueName: \"kubernetes.io/projected/73b888de-77c2-4fbf-a443-37ce0a5c28d3-kube-api-access-bq2jb\") pod \"ssh-known-hosts-edpm-deployment-64v5h\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.650479 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-ceph\") pod \"ssh-known-hosts-edpm-deployment-64v5h\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:08 crc 
kubenswrapper[4755]: I1006 08:59:08.650690 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-64v5h\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.650757 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-64v5h\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.752527 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq2jb\" (UniqueName: \"kubernetes.io/projected/73b888de-77c2-4fbf-a443-37ce0a5c28d3-kube-api-access-bq2jb\") pod \"ssh-known-hosts-edpm-deployment-64v5h\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.752646 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-ceph\") pod \"ssh-known-hosts-edpm-deployment-64v5h\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.752713 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-64v5h\" (UID: 
\"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.752742 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-64v5h\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.759046 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-ceph\") pod \"ssh-known-hosts-edpm-deployment-64v5h\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.760075 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-64v5h\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.760544 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-64v5h\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.774146 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq2jb\" (UniqueName: \"kubernetes.io/projected/73b888de-77c2-4fbf-a443-37ce0a5c28d3-kube-api-access-bq2jb\") pod \"ssh-known-hosts-edpm-deployment-64v5h\" (UID: 
\"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:08 crc kubenswrapper[4755]: I1006 08:59:08.857944 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:09 crc kubenswrapper[4755]: I1006 08:59:09.351788 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-64v5h"] Oct 06 08:59:09 crc kubenswrapper[4755]: W1006 08:59:09.359274 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73b888de_77c2_4fbf_a443_37ce0a5c28d3.slice/crio-f581fac9d44bf0763599afa007298941794d07ec8aeadb2b186fbea1d63d0b6b WatchSource:0}: Error finding container f581fac9d44bf0763599afa007298941794d07ec8aeadb2b186fbea1d63d0b6b: Status 404 returned error can't find the container with id f581fac9d44bf0763599afa007298941794d07ec8aeadb2b186fbea1d63d0b6b Oct 06 08:59:09 crc kubenswrapper[4755]: I1006 08:59:09.397545 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" event={"ID":"73b888de-77c2-4fbf-a443-37ce0a5c28d3","Type":"ContainerStarted","Data":"f581fac9d44bf0763599afa007298941794d07ec8aeadb2b186fbea1d63d0b6b"} Oct 06 08:59:10 crc kubenswrapper[4755]: I1006 08:59:10.406840 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" event={"ID":"73b888de-77c2-4fbf-a443-37ce0a5c28d3","Type":"ContainerStarted","Data":"5f80c55997d512a405104e72ade2038fbd86ceae0560720455b0f5c3fb29d504"} Oct 06 08:59:10 crc kubenswrapper[4755]: I1006 08:59:10.428512 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" podStartSLOduration=2.010581805 podStartE2EDuration="2.428489316s" podCreationTimestamp="2025-10-06 08:59:08 +0000 UTC" firstStartedPulling="2025-10-06 
08:59:09.361871845 +0000 UTC m=+2206.191187059" lastFinishedPulling="2025-10-06 08:59:09.779779356 +0000 UTC m=+2206.609094570" observedRunningTime="2025-10-06 08:59:10.421196869 +0000 UTC m=+2207.250512093" watchObservedRunningTime="2025-10-06 08:59:10.428489316 +0000 UTC m=+2207.257804540" Oct 06 08:59:19 crc kubenswrapper[4755]: I1006 08:59:19.486843 4755 generic.go:334] "Generic (PLEG): container finished" podID="73b888de-77c2-4fbf-a443-37ce0a5c28d3" containerID="5f80c55997d512a405104e72ade2038fbd86ceae0560720455b0f5c3fb29d504" exitCode=0 Oct 06 08:59:19 crc kubenswrapper[4755]: I1006 08:59:19.486922 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" event={"ID":"73b888de-77c2-4fbf-a443-37ce0a5c28d3","Type":"ContainerDied","Data":"5f80c55997d512a405104e72ade2038fbd86ceae0560720455b0f5c3fb29d504"} Oct 06 08:59:20 crc kubenswrapper[4755]: I1006 08:59:20.930070 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.109727 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq2jb\" (UniqueName: \"kubernetes.io/projected/73b888de-77c2-4fbf-a443-37ce0a5c28d3-kube-api-access-bq2jb\") pod \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.110812 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-ceph\") pod \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.110996 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-inventory-0\") pod \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.111096 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-ssh-key-openstack-edpm-ipam\") pod \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\" (UID: \"73b888de-77c2-4fbf-a443-37ce0a5c28d3\") " Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.116007 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-ceph" (OuterVolumeSpecName: "ceph") pod "73b888de-77c2-4fbf-a443-37ce0a5c28d3" (UID: "73b888de-77c2-4fbf-a443-37ce0a5c28d3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.116555 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73b888de-77c2-4fbf-a443-37ce0a5c28d3-kube-api-access-bq2jb" (OuterVolumeSpecName: "kube-api-access-bq2jb") pod "73b888de-77c2-4fbf-a443-37ce0a5c28d3" (UID: "73b888de-77c2-4fbf-a443-37ce0a5c28d3"). InnerVolumeSpecName "kube-api-access-bq2jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.137443 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "73b888de-77c2-4fbf-a443-37ce0a5c28d3" (UID: "73b888de-77c2-4fbf-a443-37ce0a5c28d3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.138298 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "73b888de-77c2-4fbf-a443-37ce0a5c28d3" (UID: "73b888de-77c2-4fbf-a443-37ce0a5c28d3"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.212988 4755 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.213035 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.213048 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq2jb\" (UniqueName: \"kubernetes.io/projected/73b888de-77c2-4fbf-a443-37ce0a5c28d3-kube-api-access-bq2jb\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.213057 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73b888de-77c2-4fbf-a443-37ce0a5c28d3-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.505851 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" event={"ID":"73b888de-77c2-4fbf-a443-37ce0a5c28d3","Type":"ContainerDied","Data":"f581fac9d44bf0763599afa007298941794d07ec8aeadb2b186fbea1d63d0b6b"} Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.505892 4755 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="f581fac9d44bf0763599afa007298941794d07ec8aeadb2b186fbea1d63d0b6b" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.505914 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-64v5h" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.578544 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz"] Oct 06 08:59:21 crc kubenswrapper[4755]: E1006 08:59:21.578954 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b888de-77c2-4fbf-a443-37ce0a5c28d3" containerName="ssh-known-hosts-edpm-deployment" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.578972 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b888de-77c2-4fbf-a443-37ce0a5c28d3" containerName="ssh-known-hosts-edpm-deployment" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.579148 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="73b888de-77c2-4fbf-a443-37ce0a5c28d3" containerName="ssh-known-hosts-edpm-deployment" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.579841 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.585399 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.585412 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.585513 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.585965 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.586077 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.593249 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz"] Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.722033 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qvnpz\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.722120 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qvnpz\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.722159 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qvnpz\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.722230 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns5v9\" (UniqueName: \"kubernetes.io/projected/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-kube-api-access-ns5v9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qvnpz\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.823464 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qvnpz\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.823529 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns5v9\" (UniqueName: \"kubernetes.io/projected/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-kube-api-access-ns5v9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qvnpz\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.823619 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qvnpz\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.823675 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qvnpz\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.827018 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qvnpz\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.827298 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qvnpz\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.827434 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qvnpz\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.844141 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ns5v9\" (UniqueName: \"kubernetes.io/projected/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-kube-api-access-ns5v9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qvnpz\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:21 crc kubenswrapper[4755]: I1006 08:59:21.894105 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:22 crc kubenswrapper[4755]: I1006 08:59:22.435379 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz"] Oct 06 08:59:22 crc kubenswrapper[4755]: I1006 08:59:22.516259 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" event={"ID":"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e","Type":"ContainerStarted","Data":"3b39116dc878f1bd4f9f01bc36cc8a35a3918d75b8b03e8f25c9a6ccd559c48e"} Oct 06 08:59:23 crc kubenswrapper[4755]: I1006 08:59:23.528672 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" event={"ID":"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e","Type":"ContainerStarted","Data":"8ee8c87a58bf77168858657588f47b8f7c9b265ac8c15e47fe72221842dac4ef"} Oct 06 08:59:23 crc kubenswrapper[4755]: I1006 08:59:23.548104 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" podStartSLOduration=2.120268444 podStartE2EDuration="2.548085427s" podCreationTimestamp="2025-10-06 08:59:21 +0000 UTC" firstStartedPulling="2025-10-06 08:59:22.439213688 +0000 UTC m=+2219.268528902" lastFinishedPulling="2025-10-06 08:59:22.867030671 +0000 UTC m=+2219.696345885" observedRunningTime="2025-10-06 08:59:23.547012351 +0000 UTC m=+2220.376327565" watchObservedRunningTime="2025-10-06 08:59:23.548085427 +0000 UTC 
m=+2220.377400641" Oct 06 08:59:30 crc kubenswrapper[4755]: I1006 08:59:30.581220 4755 generic.go:334] "Generic (PLEG): container finished" podID="ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e" containerID="8ee8c87a58bf77168858657588f47b8f7c9b265ac8c15e47fe72221842dac4ef" exitCode=0 Oct 06 08:59:30 crc kubenswrapper[4755]: I1006 08:59:30.581313 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" event={"ID":"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e","Type":"ContainerDied","Data":"8ee8c87a58bf77168858657588f47b8f7c9b265ac8c15e47fe72221842dac4ef"} Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.062750 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.224419 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns5v9\" (UniqueName: \"kubernetes.io/projected/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-kube-api-access-ns5v9\") pod \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.224532 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-ssh-key\") pod \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.224579 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-ceph\") pod \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.224627 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-inventory\") pod \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\" (UID: \"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e\") " Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.231809 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-ceph" (OuterVolumeSpecName: "ceph") pod "ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e" (UID: "ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.233829 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-kube-api-access-ns5v9" (OuterVolumeSpecName: "kube-api-access-ns5v9") pod "ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e" (UID: "ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e"). InnerVolumeSpecName "kube-api-access-ns5v9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.248751 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e" (UID: "ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.259374 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-inventory" (OuterVolumeSpecName: "inventory") pod "ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e" (UID: "ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.326443 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns5v9\" (UniqueName: \"kubernetes.io/projected/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-kube-api-access-ns5v9\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.326468 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.326478 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.326486 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.602921 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" event={"ID":"ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e","Type":"ContainerDied","Data":"3b39116dc878f1bd4f9f01bc36cc8a35a3918d75b8b03e8f25c9a6ccd559c48e"} Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.602960 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b39116dc878f1bd4f9f01bc36cc8a35a3918d75b8b03e8f25c9a6ccd559c48e" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.602985 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qvnpz" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.698807 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk"] Oct 06 08:59:32 crc kubenswrapper[4755]: E1006 08:59:32.699370 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.699392 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.699616 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.700372 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.703181 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.703210 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.703297 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.703329 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.708126 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.711052 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk"] Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.836109 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69svk\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.836214 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn5cs\" (UniqueName: \"kubernetes.io/projected/603da326-0d3a-43a3-b32b-f02d46177a85-kube-api-access-wn5cs\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69svk\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.836275 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69svk\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.836330 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69svk\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.938304 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69svk\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.938378 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69svk\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.938441 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69svk\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.938491 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn5cs\" (UniqueName: \"kubernetes.io/projected/603da326-0d3a-43a3-b32b-f02d46177a85-kube-api-access-wn5cs\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69svk\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.942695 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69svk\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.943099 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69svk\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.944310 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69svk\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:32 crc kubenswrapper[4755]: I1006 08:59:32.959433 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn5cs\" (UniqueName: \"kubernetes.io/projected/603da326-0d3a-43a3-b32b-f02d46177a85-kube-api-access-wn5cs\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69svk\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:33 crc kubenswrapper[4755]: I1006 08:59:33.021176 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:33 crc kubenswrapper[4755]: I1006 08:59:33.560829 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk"] Oct 06 08:59:33 crc kubenswrapper[4755]: I1006 08:59:33.611818 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" event={"ID":"603da326-0d3a-43a3-b32b-f02d46177a85","Type":"ContainerStarted","Data":"1818e316fa7bc6afd88c933cab18397f0ef0ef9967bd8113ddafab6ab9d0e33b"} Oct 06 08:59:34 crc kubenswrapper[4755]: I1006 08:59:34.622460 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" event={"ID":"603da326-0d3a-43a3-b32b-f02d46177a85","Type":"ContainerStarted","Data":"0eba5f54da0f9de9a8daa7cf09c179447b58e55402dd60c10d4477426e799f68"} Oct 06 08:59:34 crc kubenswrapper[4755]: I1006 08:59:34.644529 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" podStartSLOduration=2.137862865 podStartE2EDuration="2.644510366s" podCreationTimestamp="2025-10-06 08:59:32 +0000 UTC" firstStartedPulling="2025-10-06 08:59:33.565390371 +0000 UTC m=+2230.394705585" lastFinishedPulling="2025-10-06 08:59:34.072037832 +0000 UTC m=+2230.901353086" observedRunningTime="2025-10-06 08:59:34.639385482 +0000 UTC 
m=+2231.468700696" watchObservedRunningTime="2025-10-06 08:59:34.644510366 +0000 UTC m=+2231.473825580" Oct 06 08:59:43 crc kubenswrapper[4755]: I1006 08:59:43.691865 4755 generic.go:334] "Generic (PLEG): container finished" podID="603da326-0d3a-43a3-b32b-f02d46177a85" containerID="0eba5f54da0f9de9a8daa7cf09c179447b58e55402dd60c10d4477426e799f68" exitCode=0 Oct 06 08:59:43 crc kubenswrapper[4755]: I1006 08:59:43.691914 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" event={"ID":"603da326-0d3a-43a3-b32b-f02d46177a85","Type":"ContainerDied","Data":"0eba5f54da0f9de9a8daa7cf09c179447b58e55402dd60c10d4477426e799f68"} Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.125986 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.267392 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-ssh-key\") pod \"603da326-0d3a-43a3-b32b-f02d46177a85\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.268092 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn5cs\" (UniqueName: \"kubernetes.io/projected/603da326-0d3a-43a3-b32b-f02d46177a85-kube-api-access-wn5cs\") pod \"603da326-0d3a-43a3-b32b-f02d46177a85\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.268299 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-inventory\") pod \"603da326-0d3a-43a3-b32b-f02d46177a85\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " Oct 06 08:59:45 crc kubenswrapper[4755]: 
I1006 08:59:45.268419 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-ceph\") pod \"603da326-0d3a-43a3-b32b-f02d46177a85\" (UID: \"603da326-0d3a-43a3-b32b-f02d46177a85\") " Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.274897 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603da326-0d3a-43a3-b32b-f02d46177a85-kube-api-access-wn5cs" (OuterVolumeSpecName: "kube-api-access-wn5cs") pod "603da326-0d3a-43a3-b32b-f02d46177a85" (UID: "603da326-0d3a-43a3-b32b-f02d46177a85"). InnerVolumeSpecName "kube-api-access-wn5cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.276653 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-ceph" (OuterVolumeSpecName: "ceph") pod "603da326-0d3a-43a3-b32b-f02d46177a85" (UID: "603da326-0d3a-43a3-b32b-f02d46177a85"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.295030 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "603da326-0d3a-43a3-b32b-f02d46177a85" (UID: "603da326-0d3a-43a3-b32b-f02d46177a85"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.307038 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-inventory" (OuterVolumeSpecName: "inventory") pod "603da326-0d3a-43a3-b32b-f02d46177a85" (UID: "603da326-0d3a-43a3-b32b-f02d46177a85"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.371315 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.371351 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn5cs\" (UniqueName: \"kubernetes.io/projected/603da326-0d3a-43a3-b32b-f02d46177a85-kube-api-access-wn5cs\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.371364 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.371373 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/603da326-0d3a-43a3-b32b-f02d46177a85-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.718997 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" event={"ID":"603da326-0d3a-43a3-b32b-f02d46177a85","Type":"ContainerDied","Data":"1818e316fa7bc6afd88c933cab18397f0ef0ef9967bd8113ddafab6ab9d0e33b"} Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.719057 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1818e316fa7bc6afd88c933cab18397f0ef0ef9967bd8113ddafab6ab9d0e33b" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.719188 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69svk" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.808419 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j"] Oct 06 08:59:45 crc kubenswrapper[4755]: E1006 08:59:45.809054 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603da326-0d3a-43a3-b32b-f02d46177a85" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.809121 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="603da326-0d3a-43a3-b32b-f02d46177a85" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.809349 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="603da326-0d3a-43a3-b32b-f02d46177a85" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.810037 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.814535 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.814709 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.814764 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.814723 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.814675 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.814539 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.815307 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.815651 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.825911 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j"] Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.884931 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmdmh\" (UniqueName: 
\"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-kube-api-access-dmdmh\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.884997 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.885033 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.885085 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.885125 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.885154 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.885379 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.885651 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.885761 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.885923 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.886015 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.886101 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.886181 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.989358 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.989750 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.989814 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.989860 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.989881 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.989908 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.989967 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmdmh\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-kube-api-access-dmdmh\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.989992 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.990015 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.990094 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.990120 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.990140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.990159 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.995237 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.995360 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.996114 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.996203 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.998340 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:45 crc kubenswrapper[4755]: I1006 08:59:45.999184 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:46 crc kubenswrapper[4755]: I1006 08:59:45.999966 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:46 crc kubenswrapper[4755]: I1006 08:59:46.000641 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:46 crc kubenswrapper[4755]: I1006 08:59:46.001005 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:46 crc kubenswrapper[4755]: I1006 08:59:46.001922 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:46 crc kubenswrapper[4755]: I1006 08:59:46.004256 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:46 crc kubenswrapper[4755]: I1006 08:59:46.004952 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:46 crc kubenswrapper[4755]: I1006 08:59:46.009549 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmdmh\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-kube-api-access-dmdmh\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ql54j\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:46 crc kubenswrapper[4755]: I1006 08:59:46.144664 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 08:59:46 crc kubenswrapper[4755]: W1006 08:59:46.668620 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23de9f2b_8aa9_4d9c_905a_e317082fffc8.slice/crio-54e9e2737287d4ef8ea2000e04f289ea769ffea62193fed198a55ffba82a39cc WatchSource:0}: Error finding container 54e9e2737287d4ef8ea2000e04f289ea769ffea62193fed198a55ffba82a39cc: Status 404 returned error can't find the container with id 54e9e2737287d4ef8ea2000e04f289ea769ffea62193fed198a55ffba82a39cc Oct 06 08:59:46 crc kubenswrapper[4755]: I1006 08:59:46.668650 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j"] Oct 06 08:59:46 crc kubenswrapper[4755]: I1006 08:59:46.750881 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" event={"ID":"23de9f2b-8aa9-4d9c-905a-e317082fffc8","Type":"ContainerStarted","Data":"54e9e2737287d4ef8ea2000e04f289ea769ffea62193fed198a55ffba82a39cc"} Oct 06 08:59:47 crc kubenswrapper[4755]: I1006 08:59:47.762080 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" 
event={"ID":"23de9f2b-8aa9-4d9c-905a-e317082fffc8","Type":"ContainerStarted","Data":"1440c6dbd3dff391db23e33f0e0462b9efeda9c0b562a0f0487c29d7f1473d50"} Oct 06 08:59:47 crc kubenswrapper[4755]: I1006 08:59:47.786529 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" podStartSLOduration=2.098713511 podStartE2EDuration="2.786502512s" podCreationTimestamp="2025-10-06 08:59:45 +0000 UTC" firstStartedPulling="2025-10-06 08:59:46.67051468 +0000 UTC m=+2243.499829894" lastFinishedPulling="2025-10-06 08:59:47.358303671 +0000 UTC m=+2244.187618895" observedRunningTime="2025-10-06 08:59:47.782420943 +0000 UTC m=+2244.611736177" watchObservedRunningTime="2025-10-06 08:59:47.786502512 +0000 UTC m=+2244.615817726" Oct 06 08:59:51 crc kubenswrapper[4755]: I1006 08:59:51.548920 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p6hzb"] Oct 06 08:59:51 crc kubenswrapper[4755]: I1006 08:59:51.551161 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 08:59:51 crc kubenswrapper[4755]: I1006 08:59:51.561919 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6hzb"] Oct 06 08:59:51 crc kubenswrapper[4755]: I1006 08:59:51.618317 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafa383d-7eab-4a8a-aca6-44f58c03742a-utilities\") pod \"redhat-operators-p6hzb\" (UID: \"bafa383d-7eab-4a8a-aca6-44f58c03742a\") " pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 08:59:51 crc kubenswrapper[4755]: I1006 08:59:51.618411 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdgd8\" (UniqueName: \"kubernetes.io/projected/bafa383d-7eab-4a8a-aca6-44f58c03742a-kube-api-access-pdgd8\") pod \"redhat-operators-p6hzb\" (UID: \"bafa383d-7eab-4a8a-aca6-44f58c03742a\") " pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 08:59:51 crc kubenswrapper[4755]: I1006 08:59:51.618764 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafa383d-7eab-4a8a-aca6-44f58c03742a-catalog-content\") pod \"redhat-operators-p6hzb\" (UID: \"bafa383d-7eab-4a8a-aca6-44f58c03742a\") " pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 08:59:51 crc kubenswrapper[4755]: I1006 08:59:51.720683 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafa383d-7eab-4a8a-aca6-44f58c03742a-catalog-content\") pod \"redhat-operators-p6hzb\" (UID: \"bafa383d-7eab-4a8a-aca6-44f58c03742a\") " pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 08:59:51 crc kubenswrapper[4755]: I1006 08:59:51.720797 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafa383d-7eab-4a8a-aca6-44f58c03742a-utilities\") pod \"redhat-operators-p6hzb\" (UID: \"bafa383d-7eab-4a8a-aca6-44f58c03742a\") " pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 08:59:51 crc kubenswrapper[4755]: I1006 08:59:51.720848 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdgd8\" (UniqueName: \"kubernetes.io/projected/bafa383d-7eab-4a8a-aca6-44f58c03742a-kube-api-access-pdgd8\") pod \"redhat-operators-p6hzb\" (UID: \"bafa383d-7eab-4a8a-aca6-44f58c03742a\") " pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 08:59:51 crc kubenswrapper[4755]: I1006 08:59:51.721717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafa383d-7eab-4a8a-aca6-44f58c03742a-catalog-content\") pod \"redhat-operators-p6hzb\" (UID: \"bafa383d-7eab-4a8a-aca6-44f58c03742a\") " pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 08:59:51 crc kubenswrapper[4755]: I1006 08:59:51.721741 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafa383d-7eab-4a8a-aca6-44f58c03742a-utilities\") pod \"redhat-operators-p6hzb\" (UID: \"bafa383d-7eab-4a8a-aca6-44f58c03742a\") " pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 08:59:51 crc kubenswrapper[4755]: I1006 08:59:51.740590 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdgd8\" (UniqueName: \"kubernetes.io/projected/bafa383d-7eab-4a8a-aca6-44f58c03742a-kube-api-access-pdgd8\") pod \"redhat-operators-p6hzb\" (UID: \"bafa383d-7eab-4a8a-aca6-44f58c03742a\") " pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 08:59:51 crc kubenswrapper[4755]: I1006 08:59:51.876310 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 08:59:52 crc kubenswrapper[4755]: I1006 08:59:52.424748 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6hzb"] Oct 06 08:59:52 crc kubenswrapper[4755]: W1006 08:59:52.428550 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbafa383d_7eab_4a8a_aca6_44f58c03742a.slice/crio-bc0cf19b7c08c72d48113d3c0b35273b89f2d3410192c656fb3b871f7c154588 WatchSource:0}: Error finding container bc0cf19b7c08c72d48113d3c0b35273b89f2d3410192c656fb3b871f7c154588: Status 404 returned error can't find the container with id bc0cf19b7c08c72d48113d3c0b35273b89f2d3410192c656fb3b871f7c154588 Oct 06 08:59:52 crc kubenswrapper[4755]: I1006 08:59:52.820127 4755 generic.go:334] "Generic (PLEG): container finished" podID="bafa383d-7eab-4a8a-aca6-44f58c03742a" containerID="e9c2ac1a45427f29cf2293350d5a9f0830c9cc40a3b9747dc4ba8b0d2dad3717" exitCode=0 Oct 06 08:59:52 crc kubenswrapper[4755]: I1006 08:59:52.820239 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6hzb" event={"ID":"bafa383d-7eab-4a8a-aca6-44f58c03742a","Type":"ContainerDied","Data":"e9c2ac1a45427f29cf2293350d5a9f0830c9cc40a3b9747dc4ba8b0d2dad3717"} Oct 06 08:59:52 crc kubenswrapper[4755]: I1006 08:59:52.820439 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6hzb" event={"ID":"bafa383d-7eab-4a8a-aca6-44f58c03742a","Type":"ContainerStarted","Data":"bc0cf19b7c08c72d48113d3c0b35273b89f2d3410192c656fb3b871f7c154588"} Oct 06 08:59:54 crc kubenswrapper[4755]: I1006 08:59:54.840894 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6hzb" event={"ID":"bafa383d-7eab-4a8a-aca6-44f58c03742a","Type":"ContainerDied","Data":"80bd46b5875ebc0d1505da0dd7dd0f5bc455e7421726bec4d633801aa5cb8e00"} 
Oct 06 08:59:54 crc kubenswrapper[4755]: I1006 08:59:54.841240 4755 generic.go:334] "Generic (PLEG): container finished" podID="bafa383d-7eab-4a8a-aca6-44f58c03742a" containerID="80bd46b5875ebc0d1505da0dd7dd0f5bc455e7421726bec4d633801aa5cb8e00" exitCode=0 Oct 06 08:59:55 crc kubenswrapper[4755]: I1006 08:59:55.851712 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6hzb" event={"ID":"bafa383d-7eab-4a8a-aca6-44f58c03742a","Type":"ContainerStarted","Data":"23ac2597d802221078e6494575667f840aa4229c3aadfcf71bb8cd863c1d25f8"} Oct 06 08:59:55 crc kubenswrapper[4755]: I1006 08:59:55.874688 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p6hzb" podStartSLOduration=2.167875861 podStartE2EDuration="4.874667993s" podCreationTimestamp="2025-10-06 08:59:51 +0000 UTC" firstStartedPulling="2025-10-06 08:59:52.822217648 +0000 UTC m=+2249.651532862" lastFinishedPulling="2025-10-06 08:59:55.52900978 +0000 UTC m=+2252.358324994" observedRunningTime="2025-10-06 08:59:55.867649632 +0000 UTC m=+2252.696964856" watchObservedRunningTime="2025-10-06 08:59:55.874667993 +0000 UTC m=+2252.703983207" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.137264 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w"] Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.139183 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.141510 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.141901 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.149505 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w"] Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.205063 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcp2w\" (UniqueName: \"kubernetes.io/projected/74701c31-c0a3-4577-af42-b53554531f90-kube-api-access-jcp2w\") pod \"collect-profiles-29329020-nh86w\" (UID: \"74701c31-c0a3-4577-af42-b53554531f90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.205216 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74701c31-c0a3-4577-af42-b53554531f90-secret-volume\") pod \"collect-profiles-29329020-nh86w\" (UID: \"74701c31-c0a3-4577-af42-b53554531f90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.205266 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74701c31-c0a3-4577-af42-b53554531f90-config-volume\") pod \"collect-profiles-29329020-nh86w\" (UID: \"74701c31-c0a3-4577-af42-b53554531f90\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.307496 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74701c31-c0a3-4577-af42-b53554531f90-config-volume\") pod \"collect-profiles-29329020-nh86w\" (UID: \"74701c31-c0a3-4577-af42-b53554531f90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.307547 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcp2w\" (UniqueName: \"kubernetes.io/projected/74701c31-c0a3-4577-af42-b53554531f90-kube-api-access-jcp2w\") pod \"collect-profiles-29329020-nh86w\" (UID: \"74701c31-c0a3-4577-af42-b53554531f90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.307818 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74701c31-c0a3-4577-af42-b53554531f90-secret-volume\") pod \"collect-profiles-29329020-nh86w\" (UID: \"74701c31-c0a3-4577-af42-b53554531f90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.308709 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74701c31-c0a3-4577-af42-b53554531f90-config-volume\") pod \"collect-profiles-29329020-nh86w\" (UID: \"74701c31-c0a3-4577-af42-b53554531f90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.324783 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcp2w\" (UniqueName: 
\"kubernetes.io/projected/74701c31-c0a3-4577-af42-b53554531f90-kube-api-access-jcp2w\") pod \"collect-profiles-29329020-nh86w\" (UID: \"74701c31-c0a3-4577-af42-b53554531f90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.327021 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74701c31-c0a3-4577-af42-b53554531f90-secret-volume\") pod \"collect-profiles-29329020-nh86w\" (UID: \"74701c31-c0a3-4577-af42-b53554531f90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.467255 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.883356 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w"] Oct 06 09:00:00 crc kubenswrapper[4755]: W1006 09:00:00.886331 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74701c31_c0a3_4577_af42_b53554531f90.slice/crio-4f8db9f5010566a101f17d6bf053c65d495f08c9fc82e47ed3761cfd81fd4dd0 WatchSource:0}: Error finding container 4f8db9f5010566a101f17d6bf053c65d495f08c9fc82e47ed3761cfd81fd4dd0: Status 404 returned error can't find the container with id 4f8db9f5010566a101f17d6bf053c65d495f08c9fc82e47ed3761cfd81fd4dd0 Oct 06 09:00:00 crc kubenswrapper[4755]: I1006 09:00:00.902313 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" event={"ID":"74701c31-c0a3-4577-af42-b53554531f90","Type":"ContainerStarted","Data":"4f8db9f5010566a101f17d6bf053c65d495f08c9fc82e47ed3761cfd81fd4dd0"} Oct 06 09:00:01 crc kubenswrapper[4755]: 
I1006 09:00:01.876748 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 09:00:01 crc kubenswrapper[4755]: I1006 09:00:01.878426 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 09:00:01 crc kubenswrapper[4755]: I1006 09:00:01.913619 4755 generic.go:334] "Generic (PLEG): container finished" podID="74701c31-c0a3-4577-af42-b53554531f90" containerID="0182e6d9781363873e34d02731c8fa0d3dbdd5b5f76053722da80ed2fbb5029d" exitCode=0 Oct 06 09:00:01 crc kubenswrapper[4755]: I1006 09:00:01.913660 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" event={"ID":"74701c31-c0a3-4577-af42-b53554531f90","Type":"ContainerDied","Data":"0182e6d9781363873e34d02731c8fa0d3dbdd5b5f76053722da80ed2fbb5029d"} Oct 06 09:00:01 crc kubenswrapper[4755]: I1006 09:00:01.926627 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 09:00:01 crc kubenswrapper[4755]: I1006 09:00:01.975190 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 09:00:02 crc kubenswrapper[4755]: I1006 09:00:02.166442 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6hzb"] Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.249941 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.362419 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74701c31-c0a3-4577-af42-b53554531f90-secret-volume\") pod \"74701c31-c0a3-4577-af42-b53554531f90\" (UID: \"74701c31-c0a3-4577-af42-b53554531f90\") " Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.362496 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74701c31-c0a3-4577-af42-b53554531f90-config-volume\") pod \"74701c31-c0a3-4577-af42-b53554531f90\" (UID: \"74701c31-c0a3-4577-af42-b53554531f90\") " Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.362630 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcp2w\" (UniqueName: \"kubernetes.io/projected/74701c31-c0a3-4577-af42-b53554531f90-kube-api-access-jcp2w\") pod \"74701c31-c0a3-4577-af42-b53554531f90\" (UID: \"74701c31-c0a3-4577-af42-b53554531f90\") " Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.363153 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74701c31-c0a3-4577-af42-b53554531f90-config-volume" (OuterVolumeSpecName: "config-volume") pod "74701c31-c0a3-4577-af42-b53554531f90" (UID: "74701c31-c0a3-4577-af42-b53554531f90"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.368738 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74701c31-c0a3-4577-af42-b53554531f90-kube-api-access-jcp2w" (OuterVolumeSpecName: "kube-api-access-jcp2w") pod "74701c31-c0a3-4577-af42-b53554531f90" (UID: "74701c31-c0a3-4577-af42-b53554531f90"). 
InnerVolumeSpecName "kube-api-access-jcp2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.368849 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74701c31-c0a3-4577-af42-b53554531f90-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "74701c31-c0a3-4577-af42-b53554531f90" (UID: "74701c31-c0a3-4577-af42-b53554531f90"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.464374 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcp2w\" (UniqueName: \"kubernetes.io/projected/74701c31-c0a3-4577-af42-b53554531f90-kube-api-access-jcp2w\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.464405 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74701c31-c0a3-4577-af42-b53554531f90-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.464415 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74701c31-c0a3-4577-af42-b53554531f90-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.930091 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" event={"ID":"74701c31-c0a3-4577-af42-b53554531f90","Type":"ContainerDied","Data":"4f8db9f5010566a101f17d6bf053c65d495f08c9fc82e47ed3761cfd81fd4dd0"} Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.930326 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f8db9f5010566a101f17d6bf053c65d495f08c9fc82e47ed3761cfd81fd4dd0" Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.930254 4755 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p6hzb" podUID="bafa383d-7eab-4a8a-aca6-44f58c03742a" containerName="registry-server" containerID="cri-o://23ac2597d802221078e6494575667f840aa4229c3aadfcf71bb8cd863c1d25f8" gracePeriod=2 Oct 06 09:00:03 crc kubenswrapper[4755]: I1006 09:00:03.930655 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w" Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.328011 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4"] Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.333285 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328975-gtck4"] Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.352949 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.482871 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafa383d-7eab-4a8a-aca6-44f58c03742a-catalog-content\") pod \"bafa383d-7eab-4a8a-aca6-44f58c03742a\" (UID: \"bafa383d-7eab-4a8a-aca6-44f58c03742a\") " Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.482947 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafa383d-7eab-4a8a-aca6-44f58c03742a-utilities\") pod \"bafa383d-7eab-4a8a-aca6-44f58c03742a\" (UID: \"bafa383d-7eab-4a8a-aca6-44f58c03742a\") " Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.483002 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdgd8\" (UniqueName: \"kubernetes.io/projected/bafa383d-7eab-4a8a-aca6-44f58c03742a-kube-api-access-pdgd8\") pod \"bafa383d-7eab-4a8a-aca6-44f58c03742a\" (UID: \"bafa383d-7eab-4a8a-aca6-44f58c03742a\") " Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.483793 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bafa383d-7eab-4a8a-aca6-44f58c03742a-utilities" (OuterVolumeSpecName: "utilities") pod "bafa383d-7eab-4a8a-aca6-44f58c03742a" (UID: "bafa383d-7eab-4a8a-aca6-44f58c03742a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.487164 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafa383d-7eab-4a8a-aca6-44f58c03742a-kube-api-access-pdgd8" (OuterVolumeSpecName: "kube-api-access-pdgd8") pod "bafa383d-7eab-4a8a-aca6-44f58c03742a" (UID: "bafa383d-7eab-4a8a-aca6-44f58c03742a"). InnerVolumeSpecName "kube-api-access-pdgd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.584756 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafa383d-7eab-4a8a-aca6-44f58c03742a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.584794 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdgd8\" (UniqueName: \"kubernetes.io/projected/bafa383d-7eab-4a8a-aca6-44f58c03742a-kube-api-access-pdgd8\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.938379 4755 generic.go:334] "Generic (PLEG): container finished" podID="bafa383d-7eab-4a8a-aca6-44f58c03742a" containerID="23ac2597d802221078e6494575667f840aa4229c3aadfcf71bb8cd863c1d25f8" exitCode=0 Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.938461 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6hzb" Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.938461 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6hzb" event={"ID":"bafa383d-7eab-4a8a-aca6-44f58c03742a","Type":"ContainerDied","Data":"23ac2597d802221078e6494575667f840aa4229c3aadfcf71bb8cd863c1d25f8"} Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.938883 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6hzb" event={"ID":"bafa383d-7eab-4a8a-aca6-44f58c03742a","Type":"ContainerDied","Data":"bc0cf19b7c08c72d48113d3c0b35273b89f2d3410192c656fb3b871f7c154588"} Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.938904 4755 scope.go:117] "RemoveContainer" containerID="23ac2597d802221078e6494575667f840aa4229c3aadfcf71bb8cd863c1d25f8" Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.956535 4755 scope.go:117] "RemoveContainer" 
containerID="80bd46b5875ebc0d1505da0dd7dd0f5bc455e7421726bec4d633801aa5cb8e00" Oct 06 09:00:04 crc kubenswrapper[4755]: I1006 09:00:04.978811 4755 scope.go:117] "RemoveContainer" containerID="e9c2ac1a45427f29cf2293350d5a9f0830c9cc40a3b9747dc4ba8b0d2dad3717" Oct 06 09:00:05 crc kubenswrapper[4755]: I1006 09:00:05.012718 4755 scope.go:117] "RemoveContainer" containerID="23ac2597d802221078e6494575667f840aa4229c3aadfcf71bb8cd863c1d25f8" Oct 06 09:00:05 crc kubenswrapper[4755]: E1006 09:00:05.013153 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23ac2597d802221078e6494575667f840aa4229c3aadfcf71bb8cd863c1d25f8\": container with ID starting with 23ac2597d802221078e6494575667f840aa4229c3aadfcf71bb8cd863c1d25f8 not found: ID does not exist" containerID="23ac2597d802221078e6494575667f840aa4229c3aadfcf71bb8cd863c1d25f8" Oct 06 09:00:05 crc kubenswrapper[4755]: I1006 09:00:05.013199 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23ac2597d802221078e6494575667f840aa4229c3aadfcf71bb8cd863c1d25f8"} err="failed to get container status \"23ac2597d802221078e6494575667f840aa4229c3aadfcf71bb8cd863c1d25f8\": rpc error: code = NotFound desc = could not find container \"23ac2597d802221078e6494575667f840aa4229c3aadfcf71bb8cd863c1d25f8\": container with ID starting with 23ac2597d802221078e6494575667f840aa4229c3aadfcf71bb8cd863c1d25f8 not found: ID does not exist" Oct 06 09:00:05 crc kubenswrapper[4755]: I1006 09:00:05.013228 4755 scope.go:117] "RemoveContainer" containerID="80bd46b5875ebc0d1505da0dd7dd0f5bc455e7421726bec4d633801aa5cb8e00" Oct 06 09:00:05 crc kubenswrapper[4755]: E1006 09:00:05.013782 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80bd46b5875ebc0d1505da0dd7dd0f5bc455e7421726bec4d633801aa5cb8e00\": container with ID starting with 
80bd46b5875ebc0d1505da0dd7dd0f5bc455e7421726bec4d633801aa5cb8e00 not found: ID does not exist" containerID="80bd46b5875ebc0d1505da0dd7dd0f5bc455e7421726bec4d633801aa5cb8e00" Oct 06 09:00:05 crc kubenswrapper[4755]: I1006 09:00:05.013842 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80bd46b5875ebc0d1505da0dd7dd0f5bc455e7421726bec4d633801aa5cb8e00"} err="failed to get container status \"80bd46b5875ebc0d1505da0dd7dd0f5bc455e7421726bec4d633801aa5cb8e00\": rpc error: code = NotFound desc = could not find container \"80bd46b5875ebc0d1505da0dd7dd0f5bc455e7421726bec4d633801aa5cb8e00\": container with ID starting with 80bd46b5875ebc0d1505da0dd7dd0f5bc455e7421726bec4d633801aa5cb8e00 not found: ID does not exist" Oct 06 09:00:05 crc kubenswrapper[4755]: I1006 09:00:05.013875 4755 scope.go:117] "RemoveContainer" containerID="e9c2ac1a45427f29cf2293350d5a9f0830c9cc40a3b9747dc4ba8b0d2dad3717" Oct 06 09:00:05 crc kubenswrapper[4755]: E1006 09:00:05.014153 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c2ac1a45427f29cf2293350d5a9f0830c9cc40a3b9747dc4ba8b0d2dad3717\": container with ID starting with e9c2ac1a45427f29cf2293350d5a9f0830c9cc40a3b9747dc4ba8b0d2dad3717 not found: ID does not exist" containerID="e9c2ac1a45427f29cf2293350d5a9f0830c9cc40a3b9747dc4ba8b0d2dad3717" Oct 06 09:00:05 crc kubenswrapper[4755]: I1006 09:00:05.014197 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c2ac1a45427f29cf2293350d5a9f0830c9cc40a3b9747dc4ba8b0d2dad3717"} err="failed to get container status \"e9c2ac1a45427f29cf2293350d5a9f0830c9cc40a3b9747dc4ba8b0d2dad3717\": rpc error: code = NotFound desc = could not find container \"e9c2ac1a45427f29cf2293350d5a9f0830c9cc40a3b9747dc4ba8b0d2dad3717\": container with ID starting with e9c2ac1a45427f29cf2293350d5a9f0830c9cc40a3b9747dc4ba8b0d2dad3717 not found: ID does not 
exist" Oct 06 09:00:05 crc kubenswrapper[4755]: I1006 09:00:05.310914 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bafa383d-7eab-4a8a-aca6-44f58c03742a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bafa383d-7eab-4a8a-aca6-44f58c03742a" (UID: "bafa383d-7eab-4a8a-aca6-44f58c03742a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:00:05 crc kubenswrapper[4755]: I1006 09:00:05.397900 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafa383d-7eab-4a8a-aca6-44f58c03742a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:05 crc kubenswrapper[4755]: I1006 09:00:05.573649 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6hzb"] Oct 06 09:00:05 crc kubenswrapper[4755]: I1006 09:00:05.580800 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p6hzb"] Oct 06 09:00:05 crc kubenswrapper[4755]: I1006 09:00:05.890598 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bafa383d-7eab-4a8a-aca6-44f58c03742a" path="/var/lib/kubelet/pods/bafa383d-7eab-4a8a-aca6-44f58c03742a/volumes" Oct 06 09:00:05 crc kubenswrapper[4755]: I1006 09:00:05.891707 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c15c418e-734c-43df-b3e2-20619f626df3" path="/var/lib/kubelet/pods/c15c418e-734c-43df-b3e2-20619f626df3/volumes" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.552797 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fbw9q"] Oct 06 09:00:14 crc kubenswrapper[4755]: E1006 09:00:14.556174 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74701c31-c0a3-4577-af42-b53554531f90" containerName="collect-profiles" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 
09:00:14.556298 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="74701c31-c0a3-4577-af42-b53554531f90" containerName="collect-profiles" Oct 06 09:00:14 crc kubenswrapper[4755]: E1006 09:00:14.556377 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafa383d-7eab-4a8a-aca6-44f58c03742a" containerName="extract-content" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.556448 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafa383d-7eab-4a8a-aca6-44f58c03742a" containerName="extract-content" Oct 06 09:00:14 crc kubenswrapper[4755]: E1006 09:00:14.556510 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafa383d-7eab-4a8a-aca6-44f58c03742a" containerName="registry-server" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.556624 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafa383d-7eab-4a8a-aca6-44f58c03742a" containerName="registry-server" Oct 06 09:00:14 crc kubenswrapper[4755]: E1006 09:00:14.556722 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafa383d-7eab-4a8a-aca6-44f58c03742a" containerName="extract-utilities" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.556797 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafa383d-7eab-4a8a-aca6-44f58c03742a" containerName="extract-utilities" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.557072 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="74701c31-c0a3-4577-af42-b53554531f90" containerName="collect-profiles" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.557149 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="bafa383d-7eab-4a8a-aca6-44f58c03742a" containerName="registry-server" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.561159 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.570060 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fbw9q"] Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.688695 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slx6v\" (UniqueName: \"kubernetes.io/projected/d389a513-e90d-4760-bbb6-0886c9bb3c1f-kube-api-access-slx6v\") pod \"community-operators-fbw9q\" (UID: \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\") " pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.688752 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d389a513-e90d-4760-bbb6-0886c9bb3c1f-utilities\") pod \"community-operators-fbw9q\" (UID: \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\") " pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.688780 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d389a513-e90d-4760-bbb6-0886c9bb3c1f-catalog-content\") pod \"community-operators-fbw9q\" (UID: \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\") " pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.791041 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slx6v\" (UniqueName: \"kubernetes.io/projected/d389a513-e90d-4760-bbb6-0886c9bb3c1f-kube-api-access-slx6v\") pod \"community-operators-fbw9q\" (UID: \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\") " pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.791102 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d389a513-e90d-4760-bbb6-0886c9bb3c1f-utilities\") pod \"community-operators-fbw9q\" (UID: \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\") " pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.791130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d389a513-e90d-4760-bbb6-0886c9bb3c1f-catalog-content\") pod \"community-operators-fbw9q\" (UID: \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\") " pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.791810 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d389a513-e90d-4760-bbb6-0886c9bb3c1f-utilities\") pod \"community-operators-fbw9q\" (UID: \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\") " pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.791851 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d389a513-e90d-4760-bbb6-0886c9bb3c1f-catalog-content\") pod \"community-operators-fbw9q\" (UID: \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\") " pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.810297 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slx6v\" (UniqueName: \"kubernetes.io/projected/d389a513-e90d-4760-bbb6-0886c9bb3c1f-kube-api-access-slx6v\") pod \"community-operators-fbw9q\" (UID: \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\") " pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:14 crc kubenswrapper[4755]: I1006 09:00:14.930892 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:15 crc kubenswrapper[4755]: I1006 09:00:15.383392 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fbw9q"] Oct 06 09:00:16 crc kubenswrapper[4755]: I1006 09:00:16.035073 4755 generic.go:334] "Generic (PLEG): container finished" podID="d389a513-e90d-4760-bbb6-0886c9bb3c1f" containerID="0cfaa52720bfb0405f299b5d00aeb84a83ba418c73005810c2169541dfb632b5" exitCode=0 Oct 06 09:00:16 crc kubenswrapper[4755]: I1006 09:00:16.035212 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbw9q" event={"ID":"d389a513-e90d-4760-bbb6-0886c9bb3c1f","Type":"ContainerDied","Data":"0cfaa52720bfb0405f299b5d00aeb84a83ba418c73005810c2169541dfb632b5"} Oct 06 09:00:16 crc kubenswrapper[4755]: I1006 09:00:16.036028 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbw9q" event={"ID":"d389a513-e90d-4760-bbb6-0886c9bb3c1f","Type":"ContainerStarted","Data":"cb8a29b8edc93ecfe3b0b91b5069332d24ed9a018792fcbe8b1707a8e2f95227"} Oct 06 09:00:16 crc kubenswrapper[4755]: I1006 09:00:16.039630 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:00:17 crc kubenswrapper[4755]: I1006 09:00:17.055229 4755 generic.go:334] "Generic (PLEG): container finished" podID="23de9f2b-8aa9-4d9c-905a-e317082fffc8" containerID="1440c6dbd3dff391db23e33f0e0462b9efeda9c0b562a0f0487c29d7f1473d50" exitCode=0 Oct 06 09:00:17 crc kubenswrapper[4755]: I1006 09:00:17.055279 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" event={"ID":"23de9f2b-8aa9-4d9c-905a-e317082fffc8","Type":"ContainerDied","Data":"1440c6dbd3dff391db23e33f0e0462b9efeda9c0b562a0f0487c29d7f1473d50"} Oct 06 09:00:17 crc kubenswrapper[4755]: I1006 09:00:17.058018 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbw9q" event={"ID":"d389a513-e90d-4760-bbb6-0886c9bb3c1f","Type":"ContainerStarted","Data":"8924582fc3ce90779cd1246894ef339545d72453876bf74edb896ca6cd5011ee"} Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.066434 4755 generic.go:334] "Generic (PLEG): container finished" podID="d389a513-e90d-4760-bbb6-0886c9bb3c1f" containerID="8924582fc3ce90779cd1246894ef339545d72453876bf74edb896ca6cd5011ee" exitCode=0 Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.067643 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbw9q" event={"ID":"d389a513-e90d-4760-bbb6-0886c9bb3c1f","Type":"ContainerDied","Data":"8924582fc3ce90779cd1246894ef339545d72453876bf74edb896ca6cd5011ee"} Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.447938 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.558431 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-bootstrap-combined-ca-bundle\") pod \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.558468 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmdmh\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-kube-api-access-dmdmh\") pod \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.558516 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-repo-setup-combined-ca-bundle\") pod \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.558539 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.558599 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-neutron-metadata-combined-ca-bundle\") pod \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.558620 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ssh-key\") pod \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.558688 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-libvirt-combined-ca-bundle\") pod \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.558750 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ovn-combined-ca-bundle\") pod 
\"23de9f2b-8aa9-4d9c-905a-e317082fffc8\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.558793 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.558820 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-nova-combined-ca-bundle\") pod \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.558847 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.558932 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-inventory\") pod \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.558958 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ceph\") pod \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\" (UID: \"23de9f2b-8aa9-4d9c-905a-e317082fffc8\") " Oct 06 09:00:18 crc 
kubenswrapper[4755]: I1006 09:00:18.567525 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "23de9f2b-8aa9-4d9c-905a-e317082fffc8" (UID: "23de9f2b-8aa9-4d9c-905a-e317082fffc8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.567601 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "23de9f2b-8aa9-4d9c-905a-e317082fffc8" (UID: "23de9f2b-8aa9-4d9c-905a-e317082fffc8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.567697 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "23de9f2b-8aa9-4d9c-905a-e317082fffc8" (UID: "23de9f2b-8aa9-4d9c-905a-e317082fffc8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.567693 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "23de9f2b-8aa9-4d9c-905a-e317082fffc8" (UID: "23de9f2b-8aa9-4d9c-905a-e317082fffc8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.567871 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "23de9f2b-8aa9-4d9c-905a-e317082fffc8" (UID: "23de9f2b-8aa9-4d9c-905a-e317082fffc8"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.567992 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "23de9f2b-8aa9-4d9c-905a-e317082fffc8" (UID: "23de9f2b-8aa9-4d9c-905a-e317082fffc8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.568221 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "23de9f2b-8aa9-4d9c-905a-e317082fffc8" (UID: "23de9f2b-8aa9-4d9c-905a-e317082fffc8"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.568905 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-kube-api-access-dmdmh" (OuterVolumeSpecName: "kube-api-access-dmdmh") pod "23de9f2b-8aa9-4d9c-905a-e317082fffc8" (UID: "23de9f2b-8aa9-4d9c-905a-e317082fffc8"). 
InnerVolumeSpecName "kube-api-access-dmdmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.569026 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "23de9f2b-8aa9-4d9c-905a-e317082fffc8" (UID: "23de9f2b-8aa9-4d9c-905a-e317082fffc8"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.569386 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "23de9f2b-8aa9-4d9c-905a-e317082fffc8" (UID: "23de9f2b-8aa9-4d9c-905a-e317082fffc8"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.573257 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ceph" (OuterVolumeSpecName: "ceph") pod "23de9f2b-8aa9-4d9c-905a-e317082fffc8" (UID: "23de9f2b-8aa9-4d9c-905a-e317082fffc8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.591633 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-inventory" (OuterVolumeSpecName: "inventory") pod "23de9f2b-8aa9-4d9c-905a-e317082fffc8" (UID: "23de9f2b-8aa9-4d9c-905a-e317082fffc8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.594964 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "23de9f2b-8aa9-4d9c-905a-e317082fffc8" (UID: "23de9f2b-8aa9-4d9c-905a-e317082fffc8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.661296 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.661330 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.661344 4755 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.661376 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.661388 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 
09:00:18.661397 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.661405 4755 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.661415 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmdmh\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-kube-api-access-dmdmh\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.661428 4755 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.661475 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23de9f2b-8aa9-4d9c-905a-e317082fffc8-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.661489 4755 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 09:00:18.661497 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4755]: I1006 
09:00:18.661505 4755 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23de9f2b-8aa9-4d9c-905a-e317082fffc8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.077944 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbw9q" event={"ID":"d389a513-e90d-4760-bbb6-0886c9bb3c1f","Type":"ContainerStarted","Data":"36009ef4437619cfb04181991a67e015a7366977a8efb6db34c7a6f7f21f614e"} Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.080792 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" event={"ID":"23de9f2b-8aa9-4d9c-905a-e317082fffc8","Type":"ContainerDied","Data":"54e9e2737287d4ef8ea2000e04f289ea769ffea62193fed198a55ffba82a39cc"} Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.080853 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54e9e2737287d4ef8ea2000e04f289ea769ffea62193fed198a55ffba82a39cc" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.080906 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ql54j" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.103720 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fbw9q" podStartSLOduration=2.627526593 podStartE2EDuration="5.1036917s" podCreationTimestamp="2025-10-06 09:00:14 +0000 UTC" firstStartedPulling="2025-10-06 09:00:16.039321095 +0000 UTC m=+2272.868636309" lastFinishedPulling="2025-10-06 09:00:18.515486202 +0000 UTC m=+2275.344801416" observedRunningTime="2025-10-06 09:00:19.100506021 +0000 UTC m=+2275.929821235" watchObservedRunningTime="2025-10-06 09:00:19.1036917 +0000 UTC m=+2275.933006934" Oct 06 09:00:19 crc kubenswrapper[4755]: E1006 09:00:19.199412 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23de9f2b_8aa9_4d9c_905a_e317082fffc8.slice\": RecentStats: unable to find data in memory cache]" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.225189 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4"] Oct 06 09:00:19 crc kubenswrapper[4755]: E1006 09:00:19.225920 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23de9f2b-8aa9-4d9c-905a-e317082fffc8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.225944 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="23de9f2b-8aa9-4d9c-905a-e317082fffc8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.226170 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="23de9f2b-8aa9-4d9c-905a-e317082fffc8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 09:00:19 crc kubenswrapper[4755]: 
I1006 09:00:19.227400 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.232844 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.233139 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.233328 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.233320 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.233642 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.243150 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4"] Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.276602 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.276689 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnxlh\" (UniqueName: \"kubernetes.io/projected/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-kube-api-access-wnxlh\") pod 
\"ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.276745 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.276783 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.378673 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.378784 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnxlh\" (UniqueName: \"kubernetes.io/projected/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-kube-api-access-wnxlh\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.378868 
4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.378924 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.387373 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.388273 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.392453 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.400165 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnxlh\" (UniqueName: \"kubernetes.io/projected/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-kube-api-access-wnxlh\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:19 crc kubenswrapper[4755]: I1006 09:00:19.556221 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:20 crc kubenswrapper[4755]: I1006 09:00:20.041951 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4"] Oct 06 09:00:20 crc kubenswrapper[4755]: W1006 09:00:20.045347 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d796d4f_17fe_44fc_a23b_8bb3a3dc71ac.slice/crio-4bc9bdd9ecc121ad885dc6e613f2898d7c3c87b0608773e538cd33b54784bc4b WatchSource:0}: Error finding container 4bc9bdd9ecc121ad885dc6e613f2898d7c3c87b0608773e538cd33b54784bc4b: Status 404 returned error can't find the container with id 4bc9bdd9ecc121ad885dc6e613f2898d7c3c87b0608773e538cd33b54784bc4b Oct 06 09:00:20 crc kubenswrapper[4755]: I1006 09:00:20.092910 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" event={"ID":"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac","Type":"ContainerStarted","Data":"4bc9bdd9ecc121ad885dc6e613f2898d7c3c87b0608773e538cd33b54784bc4b"} Oct 06 09:00:21 crc kubenswrapper[4755]: I1006 09:00:21.103628 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" 
event={"ID":"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac","Type":"ContainerStarted","Data":"3e946f9d34302628830babcccd0e2b8e06f1d07afda97b597817616ce13f4182"} Oct 06 09:00:21 crc kubenswrapper[4755]: I1006 09:00:21.119969 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" podStartSLOduration=1.584940042 podStartE2EDuration="2.119950322s" podCreationTimestamp="2025-10-06 09:00:19 +0000 UTC" firstStartedPulling="2025-10-06 09:00:20.04746817 +0000 UTC m=+2276.876783384" lastFinishedPulling="2025-10-06 09:00:20.58247845 +0000 UTC m=+2277.411793664" observedRunningTime="2025-10-06 09:00:21.119734515 +0000 UTC m=+2277.949049739" watchObservedRunningTime="2025-10-06 09:00:21.119950322 +0000 UTC m=+2277.949265536" Oct 06 09:00:24 crc kubenswrapper[4755]: I1006 09:00:24.931189 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:24 crc kubenswrapper[4755]: I1006 09:00:24.931774 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:24 crc kubenswrapper[4755]: I1006 09:00:24.982323 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:25 crc kubenswrapper[4755]: I1006 09:00:25.177394 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:25 crc kubenswrapper[4755]: I1006 09:00:25.234209 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fbw9q"] Oct 06 09:00:26 crc kubenswrapper[4755]: I1006 09:00:26.142090 4755 generic.go:334] "Generic (PLEG): container finished" podID="3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac" containerID="3e946f9d34302628830babcccd0e2b8e06f1d07afda97b597817616ce13f4182" exitCode=0 Oct 06 
09:00:26 crc kubenswrapper[4755]: I1006 09:00:26.142834 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" event={"ID":"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac","Type":"ContainerDied","Data":"3e946f9d34302628830babcccd0e2b8e06f1d07afda97b597817616ce13f4182"} Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.149086 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fbw9q" podUID="d389a513-e90d-4760-bbb6-0886c9bb3c1f" containerName="registry-server" containerID="cri-o://36009ef4437619cfb04181991a67e015a7366977a8efb6db34c7a6f7f21f614e" gracePeriod=2 Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.606019 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.613333 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.647675 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-ceph\") pod \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.647795 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-inventory\") pod \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.647845 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnxlh\" (UniqueName: \"kubernetes.io/projected/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-kube-api-access-wnxlh\") pod \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.647879 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-ssh-key\") pod \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\" (UID: \"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac\") " Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.647954 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d389a513-e90d-4760-bbb6-0886c9bb3c1f-catalog-content\") pod \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\" (UID: \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\") " Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.647986 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slx6v\" (UniqueName: 
\"kubernetes.io/projected/d389a513-e90d-4760-bbb6-0886c9bb3c1f-kube-api-access-slx6v\") pod \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\" (UID: \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\") " Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.648058 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d389a513-e90d-4760-bbb6-0886c9bb3c1f-utilities\") pod \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\" (UID: \"d389a513-e90d-4760-bbb6-0886c9bb3c1f\") " Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.650198 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d389a513-e90d-4760-bbb6-0886c9bb3c1f-utilities" (OuterVolumeSpecName: "utilities") pod "d389a513-e90d-4760-bbb6-0886c9bb3c1f" (UID: "d389a513-e90d-4760-bbb6-0886c9bb3c1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.654335 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-kube-api-access-wnxlh" (OuterVolumeSpecName: "kube-api-access-wnxlh") pod "3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac" (UID: "3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac"). InnerVolumeSpecName "kube-api-access-wnxlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.655839 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d389a513-e90d-4760-bbb6-0886c9bb3c1f-kube-api-access-slx6v" (OuterVolumeSpecName: "kube-api-access-slx6v") pod "d389a513-e90d-4760-bbb6-0886c9bb3c1f" (UID: "d389a513-e90d-4760-bbb6-0886c9bb3c1f"). InnerVolumeSpecName "kube-api-access-slx6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.662487 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-ceph" (OuterVolumeSpecName: "ceph") pod "3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac" (UID: "3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.691048 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-inventory" (OuterVolumeSpecName: "inventory") pod "3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac" (UID: "3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.692499 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac" (UID: "3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.706924 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d389a513-e90d-4760-bbb6-0886c9bb3c1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d389a513-e90d-4760-bbb6-0886c9bb3c1f" (UID: "d389a513-e90d-4760-bbb6-0886c9bb3c1f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.751103 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.751147 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.751165 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnxlh\" (UniqueName: \"kubernetes.io/projected/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-kube-api-access-wnxlh\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.751177 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.751192 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d389a513-e90d-4760-bbb6-0886c9bb3c1f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.751205 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slx6v\" (UniqueName: \"kubernetes.io/projected/d389a513-e90d-4760-bbb6-0886c9bb3c1f-kube-api-access-slx6v\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:27 crc kubenswrapper[4755]: I1006 09:00:27.751216 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d389a513-e90d-4760-bbb6-0886c9bb3c1f-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.159996 4755 generic.go:334] "Generic (PLEG): 
container finished" podID="d389a513-e90d-4760-bbb6-0886c9bb3c1f" containerID="36009ef4437619cfb04181991a67e015a7366977a8efb6db34c7a6f7f21f614e" exitCode=0 Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.160081 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbw9q" event={"ID":"d389a513-e90d-4760-bbb6-0886c9bb3c1f","Type":"ContainerDied","Data":"36009ef4437619cfb04181991a67e015a7366977a8efb6db34c7a6f7f21f614e"} Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.160115 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbw9q" event={"ID":"d389a513-e90d-4760-bbb6-0886c9bb3c1f","Type":"ContainerDied","Data":"cb8a29b8edc93ecfe3b0b91b5069332d24ed9a018792fcbe8b1707a8e2f95227"} Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.160137 4755 scope.go:117] "RemoveContainer" containerID="36009ef4437619cfb04181991a67e015a7366977a8efb6db34c7a6f7f21f614e" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.160325 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fbw9q" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.181331 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" event={"ID":"3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac","Type":"ContainerDied","Data":"4bc9bdd9ecc121ad885dc6e613f2898d7c3c87b0608773e538cd33b54784bc4b"} Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.181411 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bc9bdd9ecc121ad885dc6e613f2898d7c3c87b0608773e538cd33b54784bc4b" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.181532 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.225627 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fbw9q"] Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.235812 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fbw9q"] Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.250691 4755 scope.go:117] "RemoveContainer" containerID="8924582fc3ce90779cd1246894ef339545d72453876bf74edb896ca6cd5011ee" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.309515 4755 scope.go:117] "RemoveContainer" containerID="0cfaa52720bfb0405f299b5d00aeb84a83ba418c73005810c2169541dfb632b5" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.331358 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w"] Oct 06 09:00:28 crc kubenswrapper[4755]: E1006 09:00:28.332708 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d389a513-e90d-4760-bbb6-0886c9bb3c1f" containerName="extract-utilities" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.332834 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d389a513-e90d-4760-bbb6-0886c9bb3c1f" containerName="extract-utilities" Oct 06 09:00:28 crc kubenswrapper[4755]: E1006 09:00:28.332916 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.332980 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 06 09:00:28 crc kubenswrapper[4755]: E1006 09:00:28.333051 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d389a513-e90d-4760-bbb6-0886c9bb3c1f" containerName="extract-content" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.333118 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d389a513-e90d-4760-bbb6-0886c9bb3c1f" containerName="extract-content" Oct 06 09:00:28 crc kubenswrapper[4755]: E1006 09:00:28.333190 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d389a513-e90d-4760-bbb6-0886c9bb3c1f" containerName="registry-server" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.333255 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d389a513-e90d-4760-bbb6-0886c9bb3c1f" containerName="registry-server" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.333462 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.333540 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d389a513-e90d-4760-bbb6-0886c9bb3c1f" containerName="registry-server" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.334270 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.338212 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.338807 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.339371 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.339714 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.340948 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.365186 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.368978 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.369230 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.369414 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.369588 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssbzj\" (UniqueName: \"kubernetes.io/projected/c1401167-6476-4be8-8b96-e7c302f4d7f7-kube-api-access-ssbzj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.369712 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.369888 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1401167-6476-4be8-8b96-e7c302f4d7f7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.389547 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w"] Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.393314 4755 scope.go:117] "RemoveContainer" containerID="36009ef4437619cfb04181991a67e015a7366977a8efb6db34c7a6f7f21f614e" Oct 06 09:00:28 crc kubenswrapper[4755]: E1006 09:00:28.396269 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36009ef4437619cfb04181991a67e015a7366977a8efb6db34c7a6f7f21f614e\": container with ID starting with 36009ef4437619cfb04181991a67e015a7366977a8efb6db34c7a6f7f21f614e not found: ID does not exist" containerID="36009ef4437619cfb04181991a67e015a7366977a8efb6db34c7a6f7f21f614e" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.396409 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36009ef4437619cfb04181991a67e015a7366977a8efb6db34c7a6f7f21f614e"} err="failed to get container status \"36009ef4437619cfb04181991a67e015a7366977a8efb6db34c7a6f7f21f614e\": rpc error: code = NotFound desc = could not find container \"36009ef4437619cfb04181991a67e015a7366977a8efb6db34c7a6f7f21f614e\": container with ID starting with 36009ef4437619cfb04181991a67e015a7366977a8efb6db34c7a6f7f21f614e not found: ID does not exist" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.396622 4755 scope.go:117] "RemoveContainer" containerID="8924582fc3ce90779cd1246894ef339545d72453876bf74edb896ca6cd5011ee" Oct 06 09:00:28 crc kubenswrapper[4755]: E1006 09:00:28.397195 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8924582fc3ce90779cd1246894ef339545d72453876bf74edb896ca6cd5011ee\": container with ID starting with 8924582fc3ce90779cd1246894ef339545d72453876bf74edb896ca6cd5011ee not found: ID does not exist" containerID="8924582fc3ce90779cd1246894ef339545d72453876bf74edb896ca6cd5011ee" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 
09:00:28.397237 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8924582fc3ce90779cd1246894ef339545d72453876bf74edb896ca6cd5011ee"} err="failed to get container status \"8924582fc3ce90779cd1246894ef339545d72453876bf74edb896ca6cd5011ee\": rpc error: code = NotFound desc = could not find container \"8924582fc3ce90779cd1246894ef339545d72453876bf74edb896ca6cd5011ee\": container with ID starting with 8924582fc3ce90779cd1246894ef339545d72453876bf74edb896ca6cd5011ee not found: ID does not exist" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.397268 4755 scope.go:117] "RemoveContainer" containerID="0cfaa52720bfb0405f299b5d00aeb84a83ba418c73005810c2169541dfb632b5" Oct 06 09:00:28 crc kubenswrapper[4755]: E1006 09:00:28.397619 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cfaa52720bfb0405f299b5d00aeb84a83ba418c73005810c2169541dfb632b5\": container with ID starting with 0cfaa52720bfb0405f299b5d00aeb84a83ba418c73005810c2169541dfb632b5 not found: ID does not exist" containerID="0cfaa52720bfb0405f299b5d00aeb84a83ba418c73005810c2169541dfb632b5" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.397643 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfaa52720bfb0405f299b5d00aeb84a83ba418c73005810c2169541dfb632b5"} err="failed to get container status \"0cfaa52720bfb0405f299b5d00aeb84a83ba418c73005810c2169541dfb632b5\": rpc error: code = NotFound desc = could not find container \"0cfaa52720bfb0405f299b5d00aeb84a83ba418c73005810c2169541dfb632b5\": container with ID starting with 0cfaa52720bfb0405f299b5d00aeb84a83ba418c73005810c2169541dfb632b5 not found: ID does not exist" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.471220 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.471279 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.471334 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.471388 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssbzj\" (UniqueName: \"kubernetes.io/projected/c1401167-6476-4be8-8b96-e7c302f4d7f7-kube-api-access-ssbzj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.471419 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.471483 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1401167-6476-4be8-8b96-e7c302f4d7f7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.472445 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1401167-6476-4be8-8b96-e7c302f4d7f7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.483768 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.483990 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.484903 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 
06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.486717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.489955 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssbzj\" (UniqueName: \"kubernetes.io/projected/c1401167-6476-4be8-8b96-e7c302f4d7f7-kube-api-access-ssbzj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhz9w\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:28 crc kubenswrapper[4755]: I1006 09:00:28.727382 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:00:29 crc kubenswrapper[4755]: I1006 09:00:29.225677 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w"] Oct 06 09:00:29 crc kubenswrapper[4755]: I1006 09:00:29.892287 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d389a513-e90d-4760-bbb6-0886c9bb3c1f" path="/var/lib/kubelet/pods/d389a513-e90d-4760-bbb6-0886c9bb3c1f/volumes" Oct 06 09:00:30 crc kubenswrapper[4755]: I1006 09:00:30.199450 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" event={"ID":"c1401167-6476-4be8-8b96-e7c302f4d7f7","Type":"ContainerStarted","Data":"b351b9ba2e7a04a0ade82445753fcd03f426f21dda392a471905ed2ff509e20d"} Oct 06 09:00:30 crc kubenswrapper[4755]: I1006 09:00:30.199499 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" 
event={"ID":"c1401167-6476-4be8-8b96-e7c302f4d7f7","Type":"ContainerStarted","Data":"aa4afe10532aa8409588bb33168bab63c8f5df342a2364821da3bbe4d2c5a4fd"} Oct 06 09:00:30 crc kubenswrapper[4755]: I1006 09:00:30.226378 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" podStartSLOduration=1.7095131609999998 podStartE2EDuration="2.226358676s" podCreationTimestamp="2025-10-06 09:00:28 +0000 UTC" firstStartedPulling="2025-10-06 09:00:29.231184603 +0000 UTC m=+2286.060499817" lastFinishedPulling="2025-10-06 09:00:29.748030118 +0000 UTC m=+2286.577345332" observedRunningTime="2025-10-06 09:00:30.220085222 +0000 UTC m=+2287.049400456" watchObservedRunningTime="2025-10-06 09:00:30.226358676 +0000 UTC m=+2287.055673880" Oct 06 09:00:41 crc kubenswrapper[4755]: I1006 09:00:41.310811 4755 scope.go:117] "RemoveContainer" containerID="0c87ae79e9b83043a5b0307bf5688d48bba48ad335a04dbf92126cf54b4deb80" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.175879 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29329021-jcqs8"] Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.178380 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.193723 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29329021-jcqs8"] Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.273620 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-combined-ca-bundle\") pod \"keystone-cron-29329021-jcqs8\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.274007 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-fernet-keys\") pod \"keystone-cron-29329021-jcqs8\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.274058 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h68dd\" (UniqueName: \"kubernetes.io/projected/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-kube-api-access-h68dd\") pod \"keystone-cron-29329021-jcqs8\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.274111 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-config-data\") pod \"keystone-cron-29329021-jcqs8\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.376262 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-combined-ca-bundle\") pod \"keystone-cron-29329021-jcqs8\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.376377 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-fernet-keys\") pod \"keystone-cron-29329021-jcqs8\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.376414 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h68dd\" (UniqueName: \"kubernetes.io/projected/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-kube-api-access-h68dd\") pod \"keystone-cron-29329021-jcqs8\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.376521 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-config-data\") pod \"keystone-cron-29329021-jcqs8\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.388460 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-combined-ca-bundle\") pod \"keystone-cron-29329021-jcqs8\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.388688 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-config-data\") pod \"keystone-cron-29329021-jcqs8\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.388713 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-fernet-keys\") pod \"keystone-cron-29329021-jcqs8\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.392903 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h68dd\" (UniqueName: \"kubernetes.io/projected/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-kube-api-access-h68dd\") pod \"keystone-cron-29329021-jcqs8\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.498241 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:00 crc kubenswrapper[4755]: I1006 09:01:00.925073 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29329021-jcqs8"] Oct 06 09:01:01 crc kubenswrapper[4755]: I1006 09:01:01.459611 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329021-jcqs8" event={"ID":"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314","Type":"ContainerStarted","Data":"602360f8137a04ae301fb9cb22ae9e8576e7a6e92bca38f740a8d850e661fbf9"} Oct 06 09:01:01 crc kubenswrapper[4755]: I1006 09:01:01.460137 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329021-jcqs8" event={"ID":"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314","Type":"ContainerStarted","Data":"f9ad38f9531afa0ca420f77fa37a6a1a5475118c75fce723bbdffa42c941e7bc"} Oct 06 09:01:01 crc kubenswrapper[4755]: I1006 09:01:01.483496 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29329021-jcqs8" podStartSLOduration=1.483477119 podStartE2EDuration="1.483477119s" podCreationTimestamp="2025-10-06 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:01:01.476967159 +0000 UTC m=+2318.306282383" watchObservedRunningTime="2025-10-06 09:01:01.483477119 +0000 UTC m=+2318.312792333" Oct 06 09:01:03 crc kubenswrapper[4755]: I1006 09:01:03.477389 4755 generic.go:334] "Generic (PLEG): container finished" podID="48ed59bc-4f30-4c9f-9c3a-9628f0a5b314" containerID="602360f8137a04ae301fb9cb22ae9e8576e7a6e92bca38f740a8d850e661fbf9" exitCode=0 Oct 06 09:01:03 crc kubenswrapper[4755]: I1006 09:01:03.477466 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329021-jcqs8" 
event={"ID":"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314","Type":"ContainerDied","Data":"602360f8137a04ae301fb9cb22ae9e8576e7a6e92bca38f740a8d850e661fbf9"} Oct 06 09:01:04 crc kubenswrapper[4755]: I1006 09:01:04.891541 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.079791 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h68dd\" (UniqueName: \"kubernetes.io/projected/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-kube-api-access-h68dd\") pod \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.080038 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-combined-ca-bundle\") pod \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.080119 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-fernet-keys\") pod \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.080226 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-config-data\") pod \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\" (UID: \"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314\") " Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.090820 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "48ed59bc-4f30-4c9f-9c3a-9628f0a5b314" (UID: "48ed59bc-4f30-4c9f-9c3a-9628f0a5b314"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.091798 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-kube-api-access-h68dd" (OuterVolumeSpecName: "kube-api-access-h68dd") pod "48ed59bc-4f30-4c9f-9c3a-9628f0a5b314" (UID: "48ed59bc-4f30-4c9f-9c3a-9628f0a5b314"). InnerVolumeSpecName "kube-api-access-h68dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.114918 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48ed59bc-4f30-4c9f-9c3a-9628f0a5b314" (UID: "48ed59bc-4f30-4c9f-9c3a-9628f0a5b314"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.135591 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-config-data" (OuterVolumeSpecName: "config-data") pod "48ed59bc-4f30-4c9f-9c3a-9628f0a5b314" (UID: "48ed59bc-4f30-4c9f-9c3a-9628f0a5b314"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.181893 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.181929 4755 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.181938 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.181949 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h68dd\" (UniqueName: \"kubernetes.io/projected/48ed59bc-4f30-4c9f-9c3a-9628f0a5b314-kube-api-access-h68dd\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.504779 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329021-jcqs8" event={"ID":"48ed59bc-4f30-4c9f-9c3a-9628f0a5b314","Type":"ContainerDied","Data":"f9ad38f9531afa0ca420f77fa37a6a1a5475118c75fce723bbdffa42c941e7bc"} Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.504822 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9ad38f9531afa0ca420f77fa37a6a1a5475118c75fce723bbdffa42c941e7bc" Oct 06 09:01:05 crc kubenswrapper[4755]: I1006 09:01:05.504866 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29329021-jcqs8" Oct 06 09:01:18 crc kubenswrapper[4755]: I1006 09:01:18.912094 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:01:18 crc kubenswrapper[4755]: I1006 09:01:18.912689 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:01:32 crc kubenswrapper[4755]: I1006 09:01:32.720235 4755 generic.go:334] "Generic (PLEG): container finished" podID="c1401167-6476-4be8-8b96-e7c302f4d7f7" containerID="b351b9ba2e7a04a0ade82445753fcd03f426f21dda392a471905ed2ff509e20d" exitCode=0 Oct 06 09:01:32 crc kubenswrapper[4755]: I1006 09:01:32.720306 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" event={"ID":"c1401167-6476-4be8-8b96-e7c302f4d7f7","Type":"ContainerDied","Data":"b351b9ba2e7a04a0ade82445753fcd03f426f21dda392a471905ed2ff509e20d"} Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.176951 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.339807 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ovn-combined-ca-bundle\") pod \"c1401167-6476-4be8-8b96-e7c302f4d7f7\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.339980 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssbzj\" (UniqueName: \"kubernetes.io/projected/c1401167-6476-4be8-8b96-e7c302f4d7f7-kube-api-access-ssbzj\") pod \"c1401167-6476-4be8-8b96-e7c302f4d7f7\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.340015 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ceph\") pod \"c1401167-6476-4be8-8b96-e7c302f4d7f7\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.340055 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-inventory\") pod \"c1401167-6476-4be8-8b96-e7c302f4d7f7\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.340184 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1401167-6476-4be8-8b96-e7c302f4d7f7-ovncontroller-config-0\") pod \"c1401167-6476-4be8-8b96-e7c302f4d7f7\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.340204 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ssh-key\") pod \"c1401167-6476-4be8-8b96-e7c302f4d7f7\" (UID: \"c1401167-6476-4be8-8b96-e7c302f4d7f7\") " Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.345718 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ceph" (OuterVolumeSpecName: "ceph") pod "c1401167-6476-4be8-8b96-e7c302f4d7f7" (UID: "c1401167-6476-4be8-8b96-e7c302f4d7f7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.345978 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c1401167-6476-4be8-8b96-e7c302f4d7f7" (UID: "c1401167-6476-4be8-8b96-e7c302f4d7f7"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.362488 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1401167-6476-4be8-8b96-e7c302f4d7f7-kube-api-access-ssbzj" (OuterVolumeSpecName: "kube-api-access-ssbzj") pod "c1401167-6476-4be8-8b96-e7c302f4d7f7" (UID: "c1401167-6476-4be8-8b96-e7c302f4d7f7"). InnerVolumeSpecName "kube-api-access-ssbzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.365396 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1401167-6476-4be8-8b96-e7c302f4d7f7-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "c1401167-6476-4be8-8b96-e7c302f4d7f7" (UID: "c1401167-6476-4be8-8b96-e7c302f4d7f7"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.366225 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-inventory" (OuterVolumeSpecName: "inventory") pod "c1401167-6476-4be8-8b96-e7c302f4d7f7" (UID: "c1401167-6476-4be8-8b96-e7c302f4d7f7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.368085 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c1401167-6476-4be8-8b96-e7c302f4d7f7" (UID: "c1401167-6476-4be8-8b96-e7c302f4d7f7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.442063 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.442727 4755 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1401167-6476-4be8-8b96-e7c302f4d7f7-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.442782 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.442803 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:34 crc 
kubenswrapper[4755]: I1006 09:01:34.442825 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssbzj\" (UniqueName: \"kubernetes.io/projected/c1401167-6476-4be8-8b96-e7c302f4d7f7-kube-api-access-ssbzj\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.442846 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1401167-6476-4be8-8b96-e7c302f4d7f7-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.738156 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" event={"ID":"c1401167-6476-4be8-8b96-e7c302f4d7f7","Type":"ContainerDied","Data":"aa4afe10532aa8409588bb33168bab63c8f5df342a2364821da3bbe4d2c5a4fd"} Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.738195 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa4afe10532aa8409588bb33168bab63c8f5df342a2364821da3bbe4d2c5a4fd" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.738255 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhz9w" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.843894 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4"] Oct 06 09:01:34 crc kubenswrapper[4755]: E1006 09:01:34.844290 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1401167-6476-4be8-8b96-e7c302f4d7f7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.844315 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1401167-6476-4be8-8b96-e7c302f4d7f7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 06 09:01:34 crc kubenswrapper[4755]: E1006 09:01:34.844348 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ed59bc-4f30-4c9f-9c3a-9628f0a5b314" containerName="keystone-cron" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.844359 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ed59bc-4f30-4c9f-9c3a-9628f0a5b314" containerName="keystone-cron" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.844554 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1401167-6476-4be8-8b96-e7c302f4d7f7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.844653 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ed59bc-4f30-4c9f-9c3a-9628f0a5b314" containerName="keystone-cron" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.845390 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849148 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849226 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849273 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849296 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47r8h\" (UniqueName: \"kubernetes.io/projected/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-kube-api-access-47r8h\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849325 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: 
\"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849353 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849431 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849484 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849491 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849535 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849632 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849702 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849804 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.849900 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.854273 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4"] Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.951080 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.951155 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: 
I1006 09:01:34.951205 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.951224 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.951280 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.951298 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47r8h\" (UniqueName: \"kubernetes.io/projected/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-kube-api-access-47r8h\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.951338 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.955351 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.956028 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.956912 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.956999 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.957055 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.958972 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:34 crc kubenswrapper[4755]: I1006 09:01:34.968679 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47r8h\" (UniqueName: \"kubernetes.io/projected/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-kube-api-access-47r8h\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:35 crc kubenswrapper[4755]: I1006 09:01:35.162653 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:01:35 crc kubenswrapper[4755]: I1006 09:01:35.657428 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4"] Oct 06 09:01:35 crc kubenswrapper[4755]: I1006 09:01:35.746274 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" event={"ID":"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2","Type":"ContainerStarted","Data":"28d5b03b9783185d8459730d8716a56f4d1ada9a180fe5fab9a14787378a7c58"} Oct 06 09:01:36 crc kubenswrapper[4755]: I1006 09:01:36.755235 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" event={"ID":"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2","Type":"ContainerStarted","Data":"faa912bd4d76baaa6b8775a904f3e8e565882f0a5554a3bc2eeac3107a277490"} Oct 06 09:01:36 crc kubenswrapper[4755]: I1006 09:01:36.778288 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" podStartSLOduration=2.305634274 podStartE2EDuration="2.778268592s" podCreationTimestamp="2025-10-06 09:01:34 +0000 UTC" firstStartedPulling="2025-10-06 09:01:35.669162441 +0000 UTC m=+2352.498477655" lastFinishedPulling="2025-10-06 09:01:36.141796759 +0000 UTC m=+2352.971111973" observedRunningTime="2025-10-06 09:01:36.77251582 +0000 UTC m=+2353.601831054" watchObservedRunningTime="2025-10-06 09:01:36.778268592 +0000 UTC m=+2353.607583806" Oct 06 09:01:48 crc kubenswrapper[4755]: I1006 09:01:48.912334 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 
09:01:48 crc kubenswrapper[4755]: I1006 09:01:48.913320 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:02:18 crc kubenswrapper[4755]: I1006 09:02:18.913479 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:02:18 crc kubenswrapper[4755]: I1006 09:02:18.914085 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:02:18 crc kubenswrapper[4755]: I1006 09:02:18.914136 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 09:02:18 crc kubenswrapper[4755]: I1006 09:02:18.914946 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:02:18 crc kubenswrapper[4755]: I1006 09:02:18.915003 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" 
podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" gracePeriod=600 Oct 06 09:02:19 crc kubenswrapper[4755]: E1006 09:02:19.039355 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:02:19 crc kubenswrapper[4755]: I1006 09:02:19.134073 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" exitCode=0 Oct 06 09:02:19 crc kubenswrapper[4755]: I1006 09:02:19.134117 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a"} Oct 06 09:02:19 crc kubenswrapper[4755]: I1006 09:02:19.134150 4755 scope.go:117] "RemoveContainer" containerID="07b1bac86ef25134b8ebed154053528dffbc3145250e0269cad9a7970e57b7da" Oct 06 09:02:19 crc kubenswrapper[4755]: I1006 09:02:19.134826 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:02:19 crc kubenswrapper[4755]: E1006 09:02:19.135103 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:02:29 crc kubenswrapper[4755]: I1006 09:02:29.269278 4755 generic.go:334] "Generic (PLEG): container finished" podID="4c7b1eb8-1304-4cb2-aae7-e302a978c2c2" containerID="faa912bd4d76baaa6b8775a904f3e8e565882f0a5554a3bc2eeac3107a277490" exitCode=0 Oct 06 09:02:29 crc kubenswrapper[4755]: I1006 09:02:29.269390 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" event={"ID":"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2","Type":"ContainerDied","Data":"faa912bd4d76baaa6b8775a904f3e8e565882f0a5554a3bc2eeac3107a277490"} Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.769781 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.878920 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:02:30 crc kubenswrapper[4755]: E1006 09:02:30.879355 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.906538 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-inventory\") pod 
\"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.906642 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-ceph\") pod \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.906705 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-ssh-key\") pod \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.906826 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-nova-metadata-neutron-config-0\") pod \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.906855 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.906937 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-neutron-metadata-combined-ca-bundle\") pod \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 
09:02:30.906986 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47r8h\" (UniqueName: \"kubernetes.io/projected/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-kube-api-access-47r8h\") pod \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\" (UID: \"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2\") " Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.913148 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-kube-api-access-47r8h" (OuterVolumeSpecName: "kube-api-access-47r8h") pod "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2" (UID: "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2"). InnerVolumeSpecName "kube-api-access-47r8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.919303 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2" (UID: "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.919356 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-ceph" (OuterVolumeSpecName: "ceph") pod "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2" (UID: "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.933107 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2" (UID: "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.934031 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2" (UID: "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.934420 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2" (UID: "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:30 crc kubenswrapper[4755]: I1006 09:02:30.938941 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-inventory" (OuterVolumeSpecName: "inventory") pod "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2" (UID: "4c7b1eb8-1304-4cb2-aae7-e302a978c2c2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.009677 4755 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.009708 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47r8h\" (UniqueName: \"kubernetes.io/projected/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-kube-api-access-47r8h\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.009720 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.009731 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.009739 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.009747 4755 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.009756 4755 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c7b1eb8-1304-4cb2-aae7-e302a978c2c2-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" 
DevicePath \"\"" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.317878 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" event={"ID":"4c7b1eb8-1304-4cb2-aae7-e302a978c2c2","Type":"ContainerDied","Data":"28d5b03b9783185d8459730d8716a56f4d1ada9a180fe5fab9a14787378a7c58"} Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.319214 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28d5b03b9783185d8459730d8716a56f4d1ada9a180fe5fab9a14787378a7c58" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.318037 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.399057 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l"] Oct 06 09:02:31 crc kubenswrapper[4755]: E1006 09:02:31.399897 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c7b1eb8-1304-4cb2-aae7-e302a978c2c2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.399995 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c7b1eb8-1304-4cb2-aae7-e302a978c2c2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.400232 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c7b1eb8-1304-4cb2-aae7-e302a978c2c2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.401007 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.403872 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.403878 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.404067 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.405964 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.406116 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.407132 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.411646 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l"] Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.527294 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.527419 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.527485 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhrn2\" (UniqueName: \"kubernetes.io/projected/89a25b96-fda5-4f63-bf99-550ba2c00701-kube-api-access-hhrn2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.527512 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.527607 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.527658 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.630056 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhrn2\" (UniqueName: \"kubernetes.io/projected/89a25b96-fda5-4f63-bf99-550ba2c00701-kube-api-access-hhrn2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.630174 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.630222 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.630250 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.630295 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.630399 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.634814 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.634820 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.635663 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 
09:02:31.635710 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.639769 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.647938 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhrn2\" (UniqueName: \"kubernetes.io/projected/89a25b96-fda5-4f63-bf99-550ba2c00701-kube-api-access-hhrn2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:31 crc kubenswrapper[4755]: I1006 09:02:31.718709 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:02:32 crc kubenswrapper[4755]: I1006 09:02:32.241803 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l"] Oct 06 09:02:32 crc kubenswrapper[4755]: I1006 09:02:32.329400 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" event={"ID":"89a25b96-fda5-4f63-bf99-550ba2c00701","Type":"ContainerStarted","Data":"ba25fcb7a772c04e7b1d47246824a3d7e8c248cdd93a6994487cb8f56eec924e"} Oct 06 09:02:33 crc kubenswrapper[4755]: I1006 09:02:33.340459 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" event={"ID":"89a25b96-fda5-4f63-bf99-550ba2c00701","Type":"ContainerStarted","Data":"be69b9c47f4f14d202eca97f44d23616fe0a217f361fad6d9a572231142af021"} Oct 06 09:02:33 crc kubenswrapper[4755]: I1006 09:02:33.370066 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" podStartSLOduration=1.934896985 podStartE2EDuration="2.370048853s" podCreationTimestamp="2025-10-06 09:02:31 +0000 UTC" firstStartedPulling="2025-10-06 09:02:32.235055527 +0000 UTC m=+2409.064370741" lastFinishedPulling="2025-10-06 09:02:32.670207395 +0000 UTC m=+2409.499522609" observedRunningTime="2025-10-06 09:02:33.369252004 +0000 UTC m=+2410.198567298" watchObservedRunningTime="2025-10-06 09:02:33.370048853 +0000 UTC m=+2410.199364067" Oct 06 09:02:41 crc kubenswrapper[4755]: I1006 09:02:41.878904 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:02:41 crc kubenswrapper[4755]: E1006 09:02:41.879644 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:02:56 crc kubenswrapper[4755]: I1006 09:02:56.879027 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:02:56 crc kubenswrapper[4755]: E1006 09:02:56.879864 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:03:08 crc kubenswrapper[4755]: I1006 09:03:08.878857 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:03:08 crc kubenswrapper[4755]: E1006 09:03:08.879699 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:03:19 crc kubenswrapper[4755]: I1006 09:03:19.879311 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:03:19 crc kubenswrapper[4755]: E1006 09:03:19.880743 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:03:31 crc kubenswrapper[4755]: I1006 09:03:31.878892 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:03:31 crc kubenswrapper[4755]: E1006 09:03:31.879799 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:03:46 crc kubenswrapper[4755]: I1006 09:03:46.878847 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:03:46 crc kubenswrapper[4755]: E1006 09:03:46.879760 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:04:00 crc kubenswrapper[4755]: I1006 09:04:00.879201 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:04:00 crc kubenswrapper[4755]: E1006 09:04:00.879915 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:04:12 crc kubenswrapper[4755]: I1006 09:04:12.878633 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:04:12 crc kubenswrapper[4755]: E1006 09:04:12.879459 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:04:25 crc kubenswrapper[4755]: I1006 09:04:25.878810 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:04:25 crc kubenswrapper[4755]: E1006 09:04:25.879502 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:04:39 crc kubenswrapper[4755]: I1006 09:04:39.880232 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:04:39 crc kubenswrapper[4755]: E1006 09:04:39.881663 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:04:50 crc kubenswrapper[4755]: I1006 09:04:50.878821 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:04:50 crc kubenswrapper[4755]: E1006 09:04:50.880642 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:05:04 crc kubenswrapper[4755]: I1006 09:05:04.878981 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:05:04 crc kubenswrapper[4755]: E1006 09:05:04.879786 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:05:17 crc kubenswrapper[4755]: I1006 09:05:17.879234 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:05:17 crc kubenswrapper[4755]: E1006 09:05:17.881603 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:05:31 crc kubenswrapper[4755]: I1006 09:05:31.879663 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:05:31 crc kubenswrapper[4755]: E1006 09:05:31.880479 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:05:44 crc kubenswrapper[4755]: I1006 09:05:44.879283 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:05:44 crc kubenswrapper[4755]: E1006 09:05:44.880630 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:05:57 crc kubenswrapper[4755]: I1006 09:05:57.879504 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:05:57 crc kubenswrapper[4755]: E1006 09:05:57.880220 4755 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:06:10 crc kubenswrapper[4755]: I1006 09:06:10.880271 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:06:10 crc kubenswrapper[4755]: E1006 09:06:10.881223 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:06:22 crc kubenswrapper[4755]: I1006 09:06:22.879217 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:06:22 crc kubenswrapper[4755]: E1006 09:06:22.880158 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:06:34 crc kubenswrapper[4755]: I1006 09:06:34.878444 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:06:34 crc kubenswrapper[4755]: E1006 09:06:34.879190 4755 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:06:38 crc kubenswrapper[4755]: I1006 09:06:38.526020 4755 generic.go:334] "Generic (PLEG): container finished" podID="89a25b96-fda5-4f63-bf99-550ba2c00701" containerID="be69b9c47f4f14d202eca97f44d23616fe0a217f361fad6d9a572231142af021" exitCode=0 Oct 06 09:06:38 crc kubenswrapper[4755]: I1006 09:06:38.526070 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" event={"ID":"89a25b96-fda5-4f63-bf99-550ba2c00701","Type":"ContainerDied","Data":"be69b9c47f4f14d202eca97f44d23616fe0a217f361fad6d9a572231142af021"} Oct 06 09:06:39 crc kubenswrapper[4755]: I1006 09:06:39.933868 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.082624 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-libvirt-secret-0\") pod \"89a25b96-fda5-4f63-bf99-550ba2c00701\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.082675 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-inventory\") pod \"89a25b96-fda5-4f63-bf99-550ba2c00701\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.082757 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-libvirt-combined-ca-bundle\") pod \"89a25b96-fda5-4f63-bf99-550ba2c00701\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.082794 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-ceph\") pod \"89a25b96-fda5-4f63-bf99-550ba2c00701\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.082878 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhrn2\" (UniqueName: \"kubernetes.io/projected/89a25b96-fda5-4f63-bf99-550ba2c00701-kube-api-access-hhrn2\") pod \"89a25b96-fda5-4f63-bf99-550ba2c00701\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.083046 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-ssh-key\") pod \"89a25b96-fda5-4f63-bf99-550ba2c00701\" (UID: \"89a25b96-fda5-4f63-bf99-550ba2c00701\") " Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.095531 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-ceph" (OuterVolumeSpecName: "ceph") pod "89a25b96-fda5-4f63-bf99-550ba2c00701" (UID: "89a25b96-fda5-4f63-bf99-550ba2c00701"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.095595 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "89a25b96-fda5-4f63-bf99-550ba2c00701" (UID: "89a25b96-fda5-4f63-bf99-550ba2c00701"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.095618 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89a25b96-fda5-4f63-bf99-550ba2c00701-kube-api-access-hhrn2" (OuterVolumeSpecName: "kube-api-access-hhrn2") pod "89a25b96-fda5-4f63-bf99-550ba2c00701" (UID: "89a25b96-fda5-4f63-bf99-550ba2c00701"). InnerVolumeSpecName "kube-api-access-hhrn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.141304 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "89a25b96-fda5-4f63-bf99-550ba2c00701" (UID: "89a25b96-fda5-4f63-bf99-550ba2c00701"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.143485 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-inventory" (OuterVolumeSpecName: "inventory") pod "89a25b96-fda5-4f63-bf99-550ba2c00701" (UID: "89a25b96-fda5-4f63-bf99-550ba2c00701"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.147770 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "89a25b96-fda5-4f63-bf99-550ba2c00701" (UID: "89a25b96-fda5-4f63-bf99-550ba2c00701"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.185761 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.186002 4755 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.186085 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.186153 4755 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 
09:06:40.186214 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89a25b96-fda5-4f63-bf99-550ba2c00701-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.186279 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhrn2\" (UniqueName: \"kubernetes.io/projected/89a25b96-fda5-4f63-bf99-550ba2c00701-kube-api-access-hhrn2\") on node \"crc\" DevicePath \"\"" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.545284 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" event={"ID":"89a25b96-fda5-4f63-bf99-550ba2c00701","Type":"ContainerDied","Data":"ba25fcb7a772c04e7b1d47246824a3d7e8c248cdd93a6994487cb8f56eec924e"} Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.545325 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba25fcb7a772c04e7b1d47246824a3d7e8c248cdd93a6994487cb8f56eec924e" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.545353 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.640795 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b"] Oct 06 09:06:40 crc kubenswrapper[4755]: E1006 09:06:40.641188 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a25b96-fda5-4f63-bf99-550ba2c00701" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.641205 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a25b96-fda5-4f63-bf99-550ba2c00701" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.641370 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a25b96-fda5-4f63-bf99-550ba2c00701" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.642023 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.644752 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.644809 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.645229 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vb7qb" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.647177 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.647237 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.647286 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.647346 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.647862 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.648457 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.656004 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b"] Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.796664 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.796727 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.796749 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.796877 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.796912 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: 
\"kubernetes.io/configmap/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.796949 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.797012 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.797060 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.797115 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-extra-config-0\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.797144 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pxcr\" (UniqueName: \"kubernetes.io/projected/6998032f-4cc5-4d30-8d2a-4c70731c20eb-kube-api-access-2pxcr\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.797189 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.899034 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.899085 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.899141 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.899190 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.899220 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.899254 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.899285 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2pxcr\" (UniqueName: \"kubernetes.io/projected/6998032f-4cc5-4d30-8d2a-4c70731c20eb-kube-api-access-2pxcr\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.899304 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.899333 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.899356 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.899382 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.901123 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.905256 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.906172 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.906390 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: 
\"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.906535 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.906767 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.907045 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.907533 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.907724 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.908265 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.920269 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pxcr\" (UniqueName: \"kubernetes.io/projected/6998032f-4cc5-4d30-8d2a-4c70731c20eb-kube-api-access-2pxcr\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:40 crc kubenswrapper[4755]: I1006 09:06:40.967554 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:06:41 crc kubenswrapper[4755]: I1006 09:06:41.548372 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b"] Oct 06 09:06:41 crc kubenswrapper[4755]: I1006 09:06:41.554897 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:06:42 crc kubenswrapper[4755]: I1006 09:06:42.567258 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" event={"ID":"6998032f-4cc5-4d30-8d2a-4c70731c20eb","Type":"ContainerStarted","Data":"516e66f899661b533a27a4e71af629b969f0940b7a666d6ec953dfeca91ede99"} Oct 06 09:06:42 crc kubenswrapper[4755]: I1006 09:06:42.567709 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" event={"ID":"6998032f-4cc5-4d30-8d2a-4c70731c20eb","Type":"ContainerStarted","Data":"ee137f543e74bd5dafe61c7c9026bd1179b3d4e69edf32784516101ef79a9fbb"} Oct 06 09:06:42 crc kubenswrapper[4755]: I1006 09:06:42.597668 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" podStartSLOduration=1.897920458 podStartE2EDuration="2.597648105s" podCreationTimestamp="2025-10-06 09:06:40 +0000 UTC" firstStartedPulling="2025-10-06 09:06:41.554576472 +0000 UTC m=+2658.383891696" lastFinishedPulling="2025-10-06 09:06:42.254304119 +0000 UTC m=+2659.083619343" observedRunningTime="2025-10-06 09:06:42.596313642 +0000 UTC m=+2659.425628856" watchObservedRunningTime="2025-10-06 09:06:42.597648105 +0000 UTC m=+2659.426963339" Oct 06 09:06:48 crc kubenswrapper[4755]: I1006 09:06:48.878760 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:06:48 
crc kubenswrapper[4755]: E1006 09:06:48.879759 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:07:01 crc kubenswrapper[4755]: I1006 09:07:01.878376 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:07:01 crc kubenswrapper[4755]: E1006 09:07:01.879133 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:07:15 crc kubenswrapper[4755]: I1006 09:07:15.880474 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:07:15 crc kubenswrapper[4755]: E1006 09:07:15.881681 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:07:30 crc kubenswrapper[4755]: I1006 09:07:30.880002 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" 
Oct 06 09:07:32 crc kubenswrapper[4755]: I1006 09:07:32.017609 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"192f4452ee1012132588e7317f9d9bfb58ff59e73705bc43b48bde85c4a0e20f"} Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.140238 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tllgg"] Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.143138 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.152755 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tllgg"] Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.314194 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-utilities\") pod \"redhat-marketplace-tllgg\" (UID: \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\") " pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.314513 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-catalog-content\") pod \"redhat-marketplace-tllgg\" (UID: \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\") " pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.314575 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ftdd\" (UniqueName: \"kubernetes.io/projected/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-kube-api-access-7ftdd\") pod 
\"redhat-marketplace-tllgg\" (UID: \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\") " pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.416026 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-utilities\") pod \"redhat-marketplace-tllgg\" (UID: \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\") " pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.416073 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-catalog-content\") pod \"redhat-marketplace-tllgg\" (UID: \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\") " pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.416126 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ftdd\" (UniqueName: \"kubernetes.io/projected/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-kube-api-access-7ftdd\") pod \"redhat-marketplace-tllgg\" (UID: \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\") " pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.416622 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-utilities\") pod \"redhat-marketplace-tllgg\" (UID: \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\") " pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.416685 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-catalog-content\") pod \"redhat-marketplace-tllgg\" (UID: 
\"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\") " pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.437306 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ftdd\" (UniqueName: \"kubernetes.io/projected/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-kube-api-access-7ftdd\") pod \"redhat-marketplace-tllgg\" (UID: \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\") " pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.466027 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:11 crc kubenswrapper[4755]: I1006 09:08:11.906577 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tllgg"] Oct 06 09:08:11 crc kubenswrapper[4755]: W1006 09:08:11.912918 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d4c41c4_d9f4_4efe_9296_e1964e1dae48.slice/crio-69b55ae4e4fd5618e6b25ceef76baf6e18d7487db539789b35b6f53e1ca26c05 WatchSource:0}: Error finding container 69b55ae4e4fd5618e6b25ceef76baf6e18d7487db539789b35b6f53e1ca26c05: Status 404 returned error can't find the container with id 69b55ae4e4fd5618e6b25ceef76baf6e18d7487db539789b35b6f53e1ca26c05 Oct 06 09:08:12 crc kubenswrapper[4755]: I1006 09:08:12.365829 4755 generic.go:334] "Generic (PLEG): container finished" podID="4d4c41c4-d9f4-4efe-9296-e1964e1dae48" containerID="4d122edf04e88b3e2e381371754357a7895d82bc9e988fce140309e82975f8a2" exitCode=0 Oct 06 09:08:12 crc kubenswrapper[4755]: I1006 09:08:12.365874 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tllgg" event={"ID":"4d4c41c4-d9f4-4efe-9296-e1964e1dae48","Type":"ContainerDied","Data":"4d122edf04e88b3e2e381371754357a7895d82bc9e988fce140309e82975f8a2"} Oct 06 09:08:12 crc 
kubenswrapper[4755]: I1006 09:08:12.366148 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tllgg" event={"ID":"4d4c41c4-d9f4-4efe-9296-e1964e1dae48","Type":"ContainerStarted","Data":"69b55ae4e4fd5618e6b25ceef76baf6e18d7487db539789b35b6f53e1ca26c05"} Oct 06 09:08:13 crc kubenswrapper[4755]: I1006 09:08:13.382335 4755 generic.go:334] "Generic (PLEG): container finished" podID="4d4c41c4-d9f4-4efe-9296-e1964e1dae48" containerID="53d0065e4e5c1e394d433dbc4944ff3e03eb962036320f682a95b62d9cc5b10f" exitCode=0 Oct 06 09:08:13 crc kubenswrapper[4755]: I1006 09:08:13.382401 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tllgg" event={"ID":"4d4c41c4-d9f4-4efe-9296-e1964e1dae48","Type":"ContainerDied","Data":"53d0065e4e5c1e394d433dbc4944ff3e03eb962036320f682a95b62d9cc5b10f"} Oct 06 09:08:14 crc kubenswrapper[4755]: I1006 09:08:14.393084 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tllgg" event={"ID":"4d4c41c4-d9f4-4efe-9296-e1964e1dae48","Type":"ContainerStarted","Data":"0861d050a469905bc56aa1a0483a02cf95406f3279e03969d5ee69c77da14821"} Oct 06 09:08:14 crc kubenswrapper[4755]: I1006 09:08:14.419666 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tllgg" podStartSLOduration=1.984408507 podStartE2EDuration="3.419646315s" podCreationTimestamp="2025-10-06 09:08:11 +0000 UTC" firstStartedPulling="2025-10-06 09:08:12.367297466 +0000 UTC m=+2749.196612680" lastFinishedPulling="2025-10-06 09:08:13.802535264 +0000 UTC m=+2750.631850488" observedRunningTime="2025-10-06 09:08:14.415474434 +0000 UTC m=+2751.244789668" watchObservedRunningTime="2025-10-06 09:08:14.419646315 +0000 UTC m=+2751.248961539" Oct 06 09:08:21 crc kubenswrapper[4755]: I1006 09:08:21.466628 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:21 crc kubenswrapper[4755]: I1006 09:08:21.467202 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:21 crc kubenswrapper[4755]: I1006 09:08:21.511850 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:22 crc kubenswrapper[4755]: I1006 09:08:22.507169 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:22 crc kubenswrapper[4755]: I1006 09:08:22.560963 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tllgg"] Oct 06 09:08:24 crc kubenswrapper[4755]: I1006 09:08:24.478666 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tllgg" podUID="4d4c41c4-d9f4-4efe-9296-e1964e1dae48" containerName="registry-server" containerID="cri-o://0861d050a469905bc56aa1a0483a02cf95406f3279e03969d5ee69c77da14821" gracePeriod=2 Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.003888 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.085492 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-utilities\") pod \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\" (UID: \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\") " Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.085556 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ftdd\" (UniqueName: \"kubernetes.io/projected/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-kube-api-access-7ftdd\") pod \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\" (UID: \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\") " Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.085629 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-catalog-content\") pod \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\" (UID: \"4d4c41c4-d9f4-4efe-9296-e1964e1dae48\") " Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.086398 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-utilities" (OuterVolumeSpecName: "utilities") pod "4d4c41c4-d9f4-4efe-9296-e1964e1dae48" (UID: "4d4c41c4-d9f4-4efe-9296-e1964e1dae48"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.089127 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.098854 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-kube-api-access-7ftdd" (OuterVolumeSpecName: "kube-api-access-7ftdd") pod "4d4c41c4-d9f4-4efe-9296-e1964e1dae48" (UID: "4d4c41c4-d9f4-4efe-9296-e1964e1dae48"). InnerVolumeSpecName "kube-api-access-7ftdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.100260 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d4c41c4-d9f4-4efe-9296-e1964e1dae48" (UID: "4d4c41c4-d9f4-4efe-9296-e1964e1dae48"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.190887 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.190936 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ftdd\" (UniqueName: \"kubernetes.io/projected/4d4c41c4-d9f4-4efe-9296-e1964e1dae48-kube-api-access-7ftdd\") on node \"crc\" DevicePath \"\"" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.488686 4755 generic.go:334] "Generic (PLEG): container finished" podID="4d4c41c4-d9f4-4efe-9296-e1964e1dae48" containerID="0861d050a469905bc56aa1a0483a02cf95406f3279e03969d5ee69c77da14821" exitCode=0 Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.488736 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tllgg" event={"ID":"4d4c41c4-d9f4-4efe-9296-e1964e1dae48","Type":"ContainerDied","Data":"0861d050a469905bc56aa1a0483a02cf95406f3279e03969d5ee69c77da14821"} Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.488768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tllgg" event={"ID":"4d4c41c4-d9f4-4efe-9296-e1964e1dae48","Type":"ContainerDied","Data":"69b55ae4e4fd5618e6b25ceef76baf6e18d7487db539789b35b6f53e1ca26c05"} Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.488776 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tllgg" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.488785 4755 scope.go:117] "RemoveContainer" containerID="0861d050a469905bc56aa1a0483a02cf95406f3279e03969d5ee69c77da14821" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.526339 4755 scope.go:117] "RemoveContainer" containerID="53d0065e4e5c1e394d433dbc4944ff3e03eb962036320f682a95b62d9cc5b10f" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.531048 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tllgg"] Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.539479 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tllgg"] Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.547471 4755 scope.go:117] "RemoveContainer" containerID="4d122edf04e88b3e2e381371754357a7895d82bc9e988fce140309e82975f8a2" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.583187 4755 scope.go:117] "RemoveContainer" containerID="0861d050a469905bc56aa1a0483a02cf95406f3279e03969d5ee69c77da14821" Oct 06 09:08:25 crc kubenswrapper[4755]: E1006 09:08:25.583701 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0861d050a469905bc56aa1a0483a02cf95406f3279e03969d5ee69c77da14821\": container with ID starting with 0861d050a469905bc56aa1a0483a02cf95406f3279e03969d5ee69c77da14821 not found: ID does not exist" containerID="0861d050a469905bc56aa1a0483a02cf95406f3279e03969d5ee69c77da14821" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.583740 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0861d050a469905bc56aa1a0483a02cf95406f3279e03969d5ee69c77da14821"} err="failed to get container status \"0861d050a469905bc56aa1a0483a02cf95406f3279e03969d5ee69c77da14821\": rpc error: code = NotFound desc = could not find container 
\"0861d050a469905bc56aa1a0483a02cf95406f3279e03969d5ee69c77da14821\": container with ID starting with 0861d050a469905bc56aa1a0483a02cf95406f3279e03969d5ee69c77da14821 not found: ID does not exist" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.583759 4755 scope.go:117] "RemoveContainer" containerID="53d0065e4e5c1e394d433dbc4944ff3e03eb962036320f682a95b62d9cc5b10f" Oct 06 09:08:25 crc kubenswrapper[4755]: E1006 09:08:25.584132 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53d0065e4e5c1e394d433dbc4944ff3e03eb962036320f682a95b62d9cc5b10f\": container with ID starting with 53d0065e4e5c1e394d433dbc4944ff3e03eb962036320f682a95b62d9cc5b10f not found: ID does not exist" containerID="53d0065e4e5c1e394d433dbc4944ff3e03eb962036320f682a95b62d9cc5b10f" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.584199 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53d0065e4e5c1e394d433dbc4944ff3e03eb962036320f682a95b62d9cc5b10f"} err="failed to get container status \"53d0065e4e5c1e394d433dbc4944ff3e03eb962036320f682a95b62d9cc5b10f\": rpc error: code = NotFound desc = could not find container \"53d0065e4e5c1e394d433dbc4944ff3e03eb962036320f682a95b62d9cc5b10f\": container with ID starting with 53d0065e4e5c1e394d433dbc4944ff3e03eb962036320f682a95b62d9cc5b10f not found: ID does not exist" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.584229 4755 scope.go:117] "RemoveContainer" containerID="4d122edf04e88b3e2e381371754357a7895d82bc9e988fce140309e82975f8a2" Oct 06 09:08:25 crc kubenswrapper[4755]: E1006 09:08:25.584503 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d122edf04e88b3e2e381371754357a7895d82bc9e988fce140309e82975f8a2\": container with ID starting with 4d122edf04e88b3e2e381371754357a7895d82bc9e988fce140309e82975f8a2 not found: ID does not exist" 
containerID="4d122edf04e88b3e2e381371754357a7895d82bc9e988fce140309e82975f8a2" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.584530 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d122edf04e88b3e2e381371754357a7895d82bc9e988fce140309e82975f8a2"} err="failed to get container status \"4d122edf04e88b3e2e381371754357a7895d82bc9e988fce140309e82975f8a2\": rpc error: code = NotFound desc = could not find container \"4d122edf04e88b3e2e381371754357a7895d82bc9e988fce140309e82975f8a2\": container with ID starting with 4d122edf04e88b3e2e381371754357a7895d82bc9e988fce140309e82975f8a2 not found: ID does not exist" Oct 06 09:08:25 crc kubenswrapper[4755]: I1006 09:08:25.888163 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d4c41c4-d9f4-4efe-9296-e1964e1dae48" path="/var/lib/kubelet/pods/4d4c41c4-d9f4-4efe-9296-e1964e1dae48/volumes" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.078074 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8mwk5"] Oct 06 09:08:39 crc kubenswrapper[4755]: E1006 09:08:39.079312 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4c41c4-d9f4-4efe-9296-e1964e1dae48" containerName="registry-server" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.079339 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4c41c4-d9f4-4efe-9296-e1964e1dae48" containerName="registry-server" Oct 06 09:08:39 crc kubenswrapper[4755]: E1006 09:08:39.079361 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4c41c4-d9f4-4efe-9296-e1964e1dae48" containerName="extract-utilities" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.079370 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4c41c4-d9f4-4efe-9296-e1964e1dae48" containerName="extract-utilities" Oct 06 09:08:39 crc kubenswrapper[4755]: E1006 09:08:39.079398 4755 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="4d4c41c4-d9f4-4efe-9296-e1964e1dae48" containerName="extract-content" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.079406 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4c41c4-d9f4-4efe-9296-e1964e1dae48" containerName="extract-content" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.079646 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d4c41c4-d9f4-4efe-9296-e1964e1dae48" containerName="registry-server" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.081434 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.102918 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8mwk5"] Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.255839 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/897d3b9a-66fd-4244-b6d4-8293ab9003cb-utilities\") pod \"certified-operators-8mwk5\" (UID: \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\") " pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.257057 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/897d3b9a-66fd-4244-b6d4-8293ab9003cb-catalog-content\") pod \"certified-operators-8mwk5\" (UID: \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\") " pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.257449 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jsgm\" (UniqueName: \"kubernetes.io/projected/897d3b9a-66fd-4244-b6d4-8293ab9003cb-kube-api-access-9jsgm\") pod 
\"certified-operators-8mwk5\" (UID: \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\") " pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.358795 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/897d3b9a-66fd-4244-b6d4-8293ab9003cb-utilities\") pod \"certified-operators-8mwk5\" (UID: \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\") " pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.358928 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/897d3b9a-66fd-4244-b6d4-8293ab9003cb-catalog-content\") pod \"certified-operators-8mwk5\" (UID: \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\") " pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.359013 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jsgm\" (UniqueName: \"kubernetes.io/projected/897d3b9a-66fd-4244-b6d4-8293ab9003cb-kube-api-access-9jsgm\") pod \"certified-operators-8mwk5\" (UID: \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\") " pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.359387 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/897d3b9a-66fd-4244-b6d4-8293ab9003cb-utilities\") pod \"certified-operators-8mwk5\" (UID: \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\") " pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.359421 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/897d3b9a-66fd-4244-b6d4-8293ab9003cb-catalog-content\") pod \"certified-operators-8mwk5\" (UID: 
\"897d3b9a-66fd-4244-b6d4-8293ab9003cb\") " pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.398677 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jsgm\" (UniqueName: \"kubernetes.io/projected/897d3b9a-66fd-4244-b6d4-8293ab9003cb-kube-api-access-9jsgm\") pod \"certified-operators-8mwk5\" (UID: \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\") " pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.418974 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:39 crc kubenswrapper[4755]: W1006 09:08:39.931012 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod897d3b9a_66fd_4244_b6d4_8293ab9003cb.slice/crio-59c9e225a983bbcf5c887b2e9d1b529b4461d390ecaac57210b28122e6b7b357 WatchSource:0}: Error finding container 59c9e225a983bbcf5c887b2e9d1b529b4461d390ecaac57210b28122e6b7b357: Status 404 returned error can't find the container with id 59c9e225a983bbcf5c887b2e9d1b529b4461d390ecaac57210b28122e6b7b357 Oct 06 09:08:39 crc kubenswrapper[4755]: I1006 09:08:39.932391 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8mwk5"] Oct 06 09:08:40 crc kubenswrapper[4755]: I1006 09:08:40.635647 4755 generic.go:334] "Generic (PLEG): container finished" podID="897d3b9a-66fd-4244-b6d4-8293ab9003cb" containerID="0686f2b630552f313164e94c46f42b093771f487259996cff3d99b803643da75" exitCode=0 Oct 06 09:08:40 crc kubenswrapper[4755]: I1006 09:08:40.636047 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mwk5" event={"ID":"897d3b9a-66fd-4244-b6d4-8293ab9003cb","Type":"ContainerDied","Data":"0686f2b630552f313164e94c46f42b093771f487259996cff3d99b803643da75"} Oct 06 09:08:40 
crc kubenswrapper[4755]: I1006 09:08:40.636082 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mwk5" event={"ID":"897d3b9a-66fd-4244-b6d4-8293ab9003cb","Type":"ContainerStarted","Data":"59c9e225a983bbcf5c887b2e9d1b529b4461d390ecaac57210b28122e6b7b357"} Oct 06 09:08:41 crc kubenswrapper[4755]: I1006 09:08:41.648985 4755 generic.go:334] "Generic (PLEG): container finished" podID="897d3b9a-66fd-4244-b6d4-8293ab9003cb" containerID="210bfe78da29bc0889333c7dcc7f11851163f924d5742ec1793f99b52530a6c7" exitCode=0 Oct 06 09:08:41 crc kubenswrapper[4755]: I1006 09:08:41.649074 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mwk5" event={"ID":"897d3b9a-66fd-4244-b6d4-8293ab9003cb","Type":"ContainerDied","Data":"210bfe78da29bc0889333c7dcc7f11851163f924d5742ec1793f99b52530a6c7"} Oct 06 09:08:42 crc kubenswrapper[4755]: I1006 09:08:42.665912 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mwk5" event={"ID":"897d3b9a-66fd-4244-b6d4-8293ab9003cb","Type":"ContainerStarted","Data":"376f91acf9b63b06726e56e053e53b2d969e209f76f0c7823e35a4cc5875bf94"} Oct 06 09:08:42 crc kubenswrapper[4755]: I1006 09:08:42.705783 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8mwk5" podStartSLOduration=2.28268469 podStartE2EDuration="3.705758275s" podCreationTimestamp="2025-10-06 09:08:39 +0000 UTC" firstStartedPulling="2025-10-06 09:08:40.638201759 +0000 UTC m=+2777.467516973" lastFinishedPulling="2025-10-06 09:08:42.061275344 +0000 UTC m=+2778.890590558" observedRunningTime="2025-10-06 09:08:42.679131892 +0000 UTC m=+2779.508447106" watchObservedRunningTime="2025-10-06 09:08:42.705758275 +0000 UTC m=+2779.535073489" Oct 06 09:08:49 crc kubenswrapper[4755]: I1006 09:08:49.420609 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:49 crc kubenswrapper[4755]: I1006 09:08:49.421295 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:49 crc kubenswrapper[4755]: I1006 09:08:49.467283 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:49 crc kubenswrapper[4755]: I1006 09:08:49.783356 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:49 crc kubenswrapper[4755]: I1006 09:08:49.826153 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8mwk5"] Oct 06 09:08:51 crc kubenswrapper[4755]: I1006 09:08:51.759838 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8mwk5" podUID="897d3b9a-66fd-4244-b6d4-8293ab9003cb" containerName="registry-server" containerID="cri-o://376f91acf9b63b06726e56e053e53b2d969e209f76f0c7823e35a4cc5875bf94" gracePeriod=2 Oct 06 09:08:52 crc kubenswrapper[4755]: I1006 09:08:52.782432 4755 generic.go:334] "Generic (PLEG): container finished" podID="897d3b9a-66fd-4244-b6d4-8293ab9003cb" containerID="376f91acf9b63b06726e56e053e53b2d969e209f76f0c7823e35a4cc5875bf94" exitCode=0 Oct 06 09:08:52 crc kubenswrapper[4755]: I1006 09:08:52.782644 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mwk5" event={"ID":"897d3b9a-66fd-4244-b6d4-8293ab9003cb","Type":"ContainerDied","Data":"376f91acf9b63b06726e56e053e53b2d969e209f76f0c7823e35a4cc5875bf94"} Oct 06 09:08:52 crc kubenswrapper[4755]: I1006 09:08:52.875489 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.019795 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/897d3b9a-66fd-4244-b6d4-8293ab9003cb-catalog-content\") pod \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\" (UID: \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\") " Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.020375 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jsgm\" (UniqueName: \"kubernetes.io/projected/897d3b9a-66fd-4244-b6d4-8293ab9003cb-kube-api-access-9jsgm\") pod \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\" (UID: \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\") " Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.020493 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/897d3b9a-66fd-4244-b6d4-8293ab9003cb-utilities\") pod \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\" (UID: \"897d3b9a-66fd-4244-b6d4-8293ab9003cb\") " Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.021340 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/897d3b9a-66fd-4244-b6d4-8293ab9003cb-utilities" (OuterVolumeSpecName: "utilities") pod "897d3b9a-66fd-4244-b6d4-8293ab9003cb" (UID: "897d3b9a-66fd-4244-b6d4-8293ab9003cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.027159 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/897d3b9a-66fd-4244-b6d4-8293ab9003cb-kube-api-access-9jsgm" (OuterVolumeSpecName: "kube-api-access-9jsgm") pod "897d3b9a-66fd-4244-b6d4-8293ab9003cb" (UID: "897d3b9a-66fd-4244-b6d4-8293ab9003cb"). InnerVolumeSpecName "kube-api-access-9jsgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.067546 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/897d3b9a-66fd-4244-b6d4-8293ab9003cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "897d3b9a-66fd-4244-b6d4-8293ab9003cb" (UID: "897d3b9a-66fd-4244-b6d4-8293ab9003cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.123063 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jsgm\" (UniqueName: \"kubernetes.io/projected/897d3b9a-66fd-4244-b6d4-8293ab9003cb-kube-api-access-9jsgm\") on node \"crc\" DevicePath \"\"" Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.123107 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/897d3b9a-66fd-4244-b6d4-8293ab9003cb-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.123122 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/897d3b9a-66fd-4244-b6d4-8293ab9003cb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.796107 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mwk5" event={"ID":"897d3b9a-66fd-4244-b6d4-8293ab9003cb","Type":"ContainerDied","Data":"59c9e225a983bbcf5c887b2e9d1b529b4461d390ecaac57210b28122e6b7b357"} Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.796181 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8mwk5" Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.796198 4755 scope.go:117] "RemoveContainer" containerID="376f91acf9b63b06726e56e053e53b2d969e209f76f0c7823e35a4cc5875bf94" Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.828408 4755 scope.go:117] "RemoveContainer" containerID="210bfe78da29bc0889333c7dcc7f11851163f924d5742ec1793f99b52530a6c7" Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.847832 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8mwk5"] Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.852640 4755 scope.go:117] "RemoveContainer" containerID="0686f2b630552f313164e94c46f42b093771f487259996cff3d99b803643da75" Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.866151 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8mwk5"] Oct 06 09:08:53 crc kubenswrapper[4755]: I1006 09:08:53.891281 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="897d3b9a-66fd-4244-b6d4-8293ab9003cb" path="/var/lib/kubelet/pods/897d3b9a-66fd-4244-b6d4-8293ab9003cb/volumes" Oct 06 09:09:48 crc kubenswrapper[4755]: I1006 09:09:48.912831 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:09:48 crc kubenswrapper[4755]: I1006 09:09:48.913904 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:10:18 crc kubenswrapper[4755]: 
I1006 09:10:18.590666 4755 generic.go:334] "Generic (PLEG): container finished" podID="6998032f-4cc5-4d30-8d2a-4c70731c20eb" containerID="516e66f899661b533a27a4e71af629b969f0940b7a666d6ec953dfeca91ede99" exitCode=0 Oct 06 09:10:18 crc kubenswrapper[4755]: I1006 09:10:18.590725 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" event={"ID":"6998032f-4cc5-4d30-8d2a-4c70731c20eb","Type":"ContainerDied","Data":"516e66f899661b533a27a4e71af629b969f0940b7a666d6ec953dfeca91ede99"} Oct 06 09:10:18 crc kubenswrapper[4755]: I1006 09:10:18.912231 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:10:18 crc kubenswrapper[4755]: I1006 09:10:18.912279 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.007405 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.177540 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-migration-ssh-key-1\") pod \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.177756 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-cell1-compute-config-0\") pod \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.177821 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ceph\") pod \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.177912 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-custom-ceph-combined-ca-bundle\") pod \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.178009 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-extra-config-0\") pod \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.178108 
4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-inventory\") pod \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.178208 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-cell1-compute-config-1\") pod \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.178360 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ssh-key\") pod \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.178397 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ceph-nova-0\") pod \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.178430 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pxcr\" (UniqueName: \"kubernetes.io/projected/6998032f-4cc5-4d30-8d2a-4c70731c20eb-kube-api-access-2pxcr\") pod \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.178531 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-migration-ssh-key-0\") pod 
\"6998032f-4cc5-4d30-8d2a-4c70731c20eb\" (UID: \"6998032f-4cc5-4d30-8d2a-4c70731c20eb\") " Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.186700 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ceph" (OuterVolumeSpecName: "ceph") pod "6998032f-4cc5-4d30-8d2a-4c70731c20eb" (UID: "6998032f-4cc5-4d30-8d2a-4c70731c20eb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.186940 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "6998032f-4cc5-4d30-8d2a-4c70731c20eb" (UID: "6998032f-4cc5-4d30-8d2a-4c70731c20eb"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.187798 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6998032f-4cc5-4d30-8d2a-4c70731c20eb-kube-api-access-2pxcr" (OuterVolumeSpecName: "kube-api-access-2pxcr") pod "6998032f-4cc5-4d30-8d2a-4c70731c20eb" (UID: "6998032f-4cc5-4d30-8d2a-4c70731c20eb"). InnerVolumeSpecName "kube-api-access-2pxcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.204947 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "6998032f-4cc5-4d30-8d2a-4c70731c20eb" (UID: "6998032f-4cc5-4d30-8d2a-4c70731c20eb"). InnerVolumeSpecName "ceph-nova-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.209024 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6998032f-4cc5-4d30-8d2a-4c70731c20eb" (UID: "6998032f-4cc5-4d30-8d2a-4c70731c20eb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.210443 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6998032f-4cc5-4d30-8d2a-4c70731c20eb" (UID: "6998032f-4cc5-4d30-8d2a-4c70731c20eb"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.215536 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6998032f-4cc5-4d30-8d2a-4c70731c20eb" (UID: "6998032f-4cc5-4d30-8d2a-4c70731c20eb"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.217170 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "6998032f-4cc5-4d30-8d2a-4c70731c20eb" (UID: "6998032f-4cc5-4d30-8d2a-4c70731c20eb"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.230598 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-inventory" (OuterVolumeSpecName: "inventory") pod "6998032f-4cc5-4d30-8d2a-4c70731c20eb" (UID: "6998032f-4cc5-4d30-8d2a-4c70731c20eb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.231044 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "6998032f-4cc5-4d30-8d2a-4c70731c20eb" (UID: "6998032f-4cc5-4d30-8d2a-4c70731c20eb"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.241925 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "6998032f-4cc5-4d30-8d2a-4c70731c20eb" (UID: "6998032f-4cc5-4d30-8d2a-4c70731c20eb"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.284283 4755 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.284351 4755 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.284383 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.284407 4755 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.284433 4755 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.284461 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.284480 4755 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-cell1-compute-config-1\") on node \"crc\" DevicePath 
\"\"" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.284500 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.284517 4755 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/6998032f-4cc5-4d30-8d2a-4c70731c20eb-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.284535 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pxcr\" (UniqueName: \"kubernetes.io/projected/6998032f-4cc5-4d30-8d2a-4c70731c20eb-kube-api-access-2pxcr\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.284552 4755 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6998032f-4cc5-4d30-8d2a-4c70731c20eb-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.618538 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" event={"ID":"6998032f-4cc5-4d30-8d2a-4c70731c20eb","Type":"ContainerDied","Data":"ee137f543e74bd5dafe61c7c9026bd1179b3d4e69edf32784516101ef79a9fbb"} Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.619113 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee137f543e74bd5dafe61c7c9026bd1179b3d4e69edf32784516101ef79a9fbb" Oct 06 09:10:20 crc kubenswrapper[4755]: I1006 09:10:20.618647 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.347199 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p95h5"] Oct 06 09:10:28 crc kubenswrapper[4755]: E1006 09:10:28.348542 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6998032f-4cc5-4d30-8d2a-4c70731c20eb" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.349968 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6998032f-4cc5-4d30-8d2a-4c70731c20eb" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 06 09:10:28 crc kubenswrapper[4755]: E1006 09:10:28.349997 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="897d3b9a-66fd-4244-b6d4-8293ab9003cb" containerName="extract-utilities" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.350005 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="897d3b9a-66fd-4244-b6d4-8293ab9003cb" containerName="extract-utilities" Oct 06 09:10:28 crc kubenswrapper[4755]: E1006 09:10:28.350048 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="897d3b9a-66fd-4244-b6d4-8293ab9003cb" containerName="extract-content" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.350055 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="897d3b9a-66fd-4244-b6d4-8293ab9003cb" containerName="extract-content" Oct 06 09:10:28 crc kubenswrapper[4755]: E1006 09:10:28.350066 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="897d3b9a-66fd-4244-b6d4-8293ab9003cb" containerName="registry-server" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.350072 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="897d3b9a-66fd-4244-b6d4-8293ab9003cb" containerName="registry-server" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.350316 
4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6998032f-4cc5-4d30-8d2a-4c70731c20eb" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.350337 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="897d3b9a-66fd-4244-b6d4-8293ab9003cb" containerName="registry-server" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.356767 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.369159 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a076ee5-710a-40b7-af6d-ef16a3a186f7-catalog-content\") pod \"community-operators-p95h5\" (UID: \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\") " pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.369236 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a076ee5-710a-40b7-af6d-ef16a3a186f7-utilities\") pod \"community-operators-p95h5\" (UID: \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\") " pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.369358 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnbvm\" (UniqueName: \"kubernetes.io/projected/7a076ee5-710a-40b7-af6d-ef16a3a186f7-kube-api-access-tnbvm\") pod \"community-operators-p95h5\" (UID: \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\") " pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.372540 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p95h5"] Oct 
06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.471183 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a076ee5-710a-40b7-af6d-ef16a3a186f7-catalog-content\") pod \"community-operators-p95h5\" (UID: \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\") " pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.471241 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a076ee5-710a-40b7-af6d-ef16a3a186f7-utilities\") pod \"community-operators-p95h5\" (UID: \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\") " pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.471317 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnbvm\" (UniqueName: \"kubernetes.io/projected/7a076ee5-710a-40b7-af6d-ef16a3a186f7-kube-api-access-tnbvm\") pod \"community-operators-p95h5\" (UID: \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\") " pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.472070 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a076ee5-710a-40b7-af6d-ef16a3a186f7-catalog-content\") pod \"community-operators-p95h5\" (UID: \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\") " pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.472465 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a076ee5-710a-40b7-af6d-ef16a3a186f7-utilities\") pod \"community-operators-p95h5\" (UID: \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\") " pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 
09:10:28.498747 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnbvm\" (UniqueName: \"kubernetes.io/projected/7a076ee5-710a-40b7-af6d-ef16a3a186f7-kube-api-access-tnbvm\") pod \"community-operators-p95h5\" (UID: \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\") " pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.699745 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:28 crc kubenswrapper[4755]: I1006 09:10:28.997371 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p95h5"] Oct 06 09:10:29 crc kubenswrapper[4755]: I1006 09:10:29.698030 4755 generic.go:334] "Generic (PLEG): container finished" podID="7a076ee5-710a-40b7-af6d-ef16a3a186f7" containerID="11096304ac2c7aebbf04b8e8c2746c14cec0847752ef7470acb4fa1e97926f21" exitCode=0 Oct 06 09:10:29 crc kubenswrapper[4755]: I1006 09:10:29.698078 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95h5" event={"ID":"7a076ee5-710a-40b7-af6d-ef16a3a186f7","Type":"ContainerDied","Data":"11096304ac2c7aebbf04b8e8c2746c14cec0847752ef7470acb4fa1e97926f21"} Oct 06 09:10:29 crc kubenswrapper[4755]: I1006 09:10:29.699685 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95h5" event={"ID":"7a076ee5-710a-40b7-af6d-ef16a3a186f7","Type":"ContainerStarted","Data":"0d06b84b7bd1e1e1fa9e7e7653dca4d8788fac49ad0e0e0a06d288c88d029cdd"} Oct 06 09:10:30 crc kubenswrapper[4755]: I1006 09:10:30.709126 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95h5" event={"ID":"7a076ee5-710a-40b7-af6d-ef16a3a186f7","Type":"ContainerStarted","Data":"25467b052ec1fb991ee20812b92f4197a2083e3a3dd312c997eb4306a35aefc2"} Oct 06 09:10:31 crc kubenswrapper[4755]: I1006 
09:10:31.718831 4755 generic.go:334] "Generic (PLEG): container finished" podID="7a076ee5-710a-40b7-af6d-ef16a3a186f7" containerID="25467b052ec1fb991ee20812b92f4197a2083e3a3dd312c997eb4306a35aefc2" exitCode=0 Oct 06 09:10:31 crc kubenswrapper[4755]: I1006 09:10:31.718927 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95h5" event={"ID":"7a076ee5-710a-40b7-af6d-ef16a3a186f7","Type":"ContainerDied","Data":"25467b052ec1fb991ee20812b92f4197a2083e3a3dd312c997eb4306a35aefc2"} Oct 06 09:10:32 crc kubenswrapper[4755]: I1006 09:10:32.732040 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95h5" event={"ID":"7a076ee5-710a-40b7-af6d-ef16a3a186f7","Type":"ContainerStarted","Data":"3b6f09f4ece78e1ef831074413ebba2deb4a24632b778e73e3459006cacaac8f"} Oct 06 09:10:32 crc kubenswrapper[4755]: I1006 09:10:32.763788 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p95h5" podStartSLOduration=2.088708679 podStartE2EDuration="4.763763301s" podCreationTimestamp="2025-10-06 09:10:28 +0000 UTC" firstStartedPulling="2025-10-06 09:10:29.699596153 +0000 UTC m=+2886.528911367" lastFinishedPulling="2025-10-06 09:10:32.374650775 +0000 UTC m=+2889.203965989" observedRunningTime="2025-10-06 09:10:32.756085613 +0000 UTC m=+2889.585400837" watchObservedRunningTime="2025-10-06 09:10:32.763763301 +0000 UTC m=+2889.593078515" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.658253 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.660932 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.663465 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.676300 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.687884 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.770495 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.772633 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.789887 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.792343 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.792445 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9a5baec-e335-4430-87ff-df995cc28434-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.792481 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.792615 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvjcn\" (UniqueName: \"kubernetes.io/projected/e9a5baec-e335-4430-87ff-df995cc28434-kube-api-access-pvjcn\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.792653 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.792716 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9a5baec-e335-4430-87ff-df995cc28434-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.792837 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.792892 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-sys\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.792926 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.792953 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9a5baec-e335-4430-87ff-df995cc28434-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.792993 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9a5baec-e335-4430-87ff-df995cc28434-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.793031 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.793068 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.793105 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9a5baec-e335-4430-87ff-df995cc28434-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.793133 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-run\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.793171 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-dev\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.802040 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895303 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895372 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708a75eb-b436-40c0-b25c-8935f399cb4a-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895408 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-etc-nvme\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895437 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9a5baec-e335-4430-87ff-df995cc28434-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895458 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895484 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-run\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895507 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-sys\") pod 
\"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895601 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-dev\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895637 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/708a75eb-b436-40c0-b25c-8935f399cb4a-config-data-custom\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895671 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895698 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvjcn\" (UniqueName: \"kubernetes.io/projected/e9a5baec-e335-4430-87ff-df995cc28434-kube-api-access-pvjcn\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895728 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 
09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895755 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895781 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9a5baec-e335-4430-87ff-df995cc28434-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895802 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895822 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708a75eb-b436-40c0-b25c-8935f399cb4a-config-data\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895857 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895876 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/708a75eb-b436-40c0-b25c-8935f399cb4a-ceph\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895898 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895921 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lztbz\" (UniqueName: \"kubernetes.io/projected/708a75eb-b436-40c0-b25c-8935f399cb4a-kube-api-access-lztbz\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895952 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-sys\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895972 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.895993 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896010 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9a5baec-e335-4430-87ff-df995cc28434-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896045 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9a5baec-e335-4430-87ff-df995cc28434-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896063 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-lib-modules\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896085 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896105 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: 
\"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896127 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9a5baec-e335-4430-87ff-df995cc28434-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896127 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896146 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-run\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896201 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-run\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896268 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-dev\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896350 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708a75eb-b436-40c0-b25c-8935f399cb4a-scripts\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896640 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896694 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-sys\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896766 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-dev\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896980 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896970 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: 
\"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.897172 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.897262 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.896781 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9a5baec-e335-4430-87ff-df995cc28434-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.905766 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9a5baec-e335-4430-87ff-df995cc28434-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.909871 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9a5baec-e335-4430-87ff-df995cc28434-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 
09:10:34.910486 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9a5baec-e335-4430-87ff-df995cc28434-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.911112 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9a5baec-e335-4430-87ff-df995cc28434-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.927047 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9a5baec-e335-4430-87ff-df995cc28434-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.933206 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvjcn\" (UniqueName: \"kubernetes.io/projected/e9a5baec-e335-4430-87ff-df995cc28434-kube-api-access-pvjcn\") pod \"cinder-volume-volume1-0\" (UID: \"e9a5baec-e335-4430-87ff-df995cc28434\") " pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:34 crc kubenswrapper[4755]: I1006 09:10:34.980476 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.004654 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708a75eb-b436-40c0-b25c-8935f399cb4a-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.004734 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-etc-nvme\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.004781 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-run\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.004801 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-sys\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.004836 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-dev\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.004860 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/708a75eb-b436-40c0-b25c-8935f399cb4a-config-data-custom\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.004886 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.004906 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.004936 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.004953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708a75eb-b436-40c0-b25c-8935f399cb4a-config-data\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.004994 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 
09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.005018 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/708a75eb-b436-40c0-b25c-8935f399cb4a-ceph\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.005049 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lztbz\" (UniqueName: \"kubernetes.io/projected/708a75eb-b436-40c0-b25c-8935f399cb4a-kube-api-access-lztbz\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.005072 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.005094 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-lib-modules\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.005144 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708a75eb-b436-40c0-b25c-8935f399cb4a-scripts\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.006374 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.010743 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-run\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.010854 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-etc-nvme\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.010907 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.011591 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-lib-modules\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.011694 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-sys\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.011832 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.011923 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-dev\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.013076 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.013133 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/708a75eb-b436-40c0-b25c-8935f399cb4a-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.028761 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708a75eb-b436-40c0-b25c-8935f399cb4a-config-data\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.032393 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708a75eb-b436-40c0-b25c-8935f399cb4a-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " 
pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.032512 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/708a75eb-b436-40c0-b25c-8935f399cb4a-ceph\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.033973 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708a75eb-b436-40c0-b25c-8935f399cb4a-scripts\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.036052 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lztbz\" (UniqueName: \"kubernetes.io/projected/708a75eb-b436-40c0-b25c-8935f399cb4a-kube-api-access-lztbz\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.046487 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/708a75eb-b436-40c0-b25c-8935f399cb4a-config-data-custom\") pod \"cinder-backup-0\" (UID: \"708a75eb-b436-40c0-b25c-8935f399cb4a\") " pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.091000 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.532198 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-r74gz"] Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.543760 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-r74gz" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.635595 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.638154 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27pqd\" (UniqueName: \"kubernetes.io/projected/95bc5e36-48f0-46d8-a2d6-ef94e52c7b96-kube-api-access-27pqd\") pod \"manila-db-create-r74gz\" (UID: \"95bc5e36-48f0-46d8-a2d6-ef94e52c7b96\") " pod="openstack/manila-db-create-r74gz" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.645988 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.651223 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.651403 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.651542 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.659619 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-psxww" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.673266 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-r74gz"] Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.684067 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58d8996c-r5v5j"] Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.699630 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.709964 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.711282 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.711612 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-j5wvf" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.711855 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.722030 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.733046 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58d8996c-r5v5j"] Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.743803 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/756b4ec9-3c99-437a-a8af-5a114fb1828e-ceph\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.743913 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-config-data\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.743939 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.743967 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.743995 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/756b4ec9-3c99-437a-a8af-5a114fb1828e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.744042 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.744071 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-scripts\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.744110 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/756b4ec9-3c99-437a-a8af-5a114fb1828e-logs\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.744190 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4ftf\" (UniqueName: \"kubernetes.io/projected/756b4ec9-3c99-437a-a8af-5a114fb1828e-kube-api-access-w4ftf\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.744227 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27pqd\" (UniqueName: \"kubernetes.io/projected/95bc5e36-48f0-46d8-a2d6-ef94e52c7b96-kube-api-access-27pqd\") pod \"manila-db-create-r74gz\" (UID: \"95bc5e36-48f0-46d8-a2d6-ef94e52c7b96\") " pod="openstack/manila-db-create-r74gz" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.759658 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.761892 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.765213 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.765469 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.773209 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27pqd\" (UniqueName: \"kubernetes.io/projected/95bc5e36-48f0-46d8-a2d6-ef94e52c7b96-kube-api-access-27pqd\") pod \"manila-db-create-r74gz\" (UID: \"95bc5e36-48f0-46d8-a2d6-ef94e52c7b96\") " pod="openstack/manila-db-create-r74gz" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.774859 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.791453 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:10:35 crc kubenswrapper[4755]: E1006 09:10:35.792236 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run kube-api-access-w4ftf logs public-tls-certs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="756b4ec9-3c99-437a-a8af-5a114fb1828e" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.814025 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-85c75dc44f-w5tzr"] Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.817065 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.849975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-config-data\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.850220 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.850251 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860017 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6f52a11-abe0-44d2-b543-80fd120a6299-scripts\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860087 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/756b4ec9-3c99-437a-a8af-5a114fb1828e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" 
Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860188 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6f52a11-abe0-44d2-b543-80fd120a6299-config-data\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860211 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860232 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f9kt\" (UniqueName: \"kubernetes.io/projected/08446d32-758b-4984-bdb9-6ede431279b2-kube-api-access-4f9kt\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860261 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f52a11-abe0-44d2-b543-80fd120a6299-logs\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860328 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-scripts\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860344 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzbh2\" (UniqueName: \"kubernetes.io/projected/a6f52a11-abe0-44d2-b543-80fd120a6299-kube-api-access-qzbh2\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860423 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/756b4ec9-3c99-437a-a8af-5a114fb1828e-logs\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860487 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/08446d32-758b-4984-bdb9-6ede431279b2-ceph\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860526 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08446d32-758b-4984-bdb9-6ede431279b2-logs\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860584 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a6f52a11-abe0-44d2-b543-80fd120a6299-horizon-secret-key\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860638 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860662 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860694 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860771 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4ftf\" (UniqueName: \"kubernetes.io/projected/756b4ec9-3c99-437a-a8af-5a114fb1828e-kube-api-access-w4ftf\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860830 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/756b4ec9-3c99-437a-a8af-5a114fb1828e-ceph\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860850 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860883 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08446d32-758b-4984-bdb9-6ede431279b2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.860906 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.861419 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/756b4ec9-3c99-437a-a8af-5a114fb1828e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.861654 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.861794 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-85c75dc44f-w5tzr"] Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.863702 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/756b4ec9-3c99-437a-a8af-5a114fb1828e-logs\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.867692 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.868090 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.868543 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-scripts\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: E1006 09:10:35.868607 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-4f9kt logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="08446d32-758b-4984-bdb9-6ede431279b2" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.869182 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.870436 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/756b4ec9-3c99-437a-a8af-5a114fb1828e-ceph\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.870971 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-config-data\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.884651 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4ftf\" (UniqueName: \"kubernetes.io/projected/756b4ec9-3c99-437a-a8af-5a114fb1828e-kube-api-access-w4ftf\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.900942 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-r74gz" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.947421 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.949601 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.963137 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-horizon-secret-key\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.963618 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-config-data\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.963744 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/08446d32-758b-4984-bdb9-6ede431279b2-ceph\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.963901 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08446d32-758b-4984-bdb9-6ede431279b2-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.963987 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a6f52a11-abe0-44d2-b543-80fd120a6299-horizon-secret-key\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.964998 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.965051 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.965086 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.965289 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " 
pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.965339 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08446d32-758b-4984-bdb9-6ede431279b2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.965366 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.965545 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smn7b\" (UniqueName: \"kubernetes.io/projected/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-kube-api-access-smn7b\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.965617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6f52a11-abe0-44d2-b543-80fd120a6299-scripts\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.965676 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-scripts\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:35 crc 
kubenswrapper[4755]: I1006 09:10:35.965777 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-logs\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.965835 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6f52a11-abe0-44d2-b543-80fd120a6299-config-data\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.965856 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f9kt\" (UniqueName: \"kubernetes.io/projected/08446d32-758b-4984-bdb9-6ede431279b2-kube-api-access-4f9kt\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.965892 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f52a11-abe0-44d2-b543-80fd120a6299-logs\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.965940 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzbh2\" (UniqueName: \"kubernetes.io/projected/a6f52a11-abe0-44d2-b543-80fd120a6299-kube-api-access-qzbh2\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.966551 4755 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08446d32-758b-4984-bdb9-6ede431279b2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.967309 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6f52a11-abe0-44d2-b543-80fd120a6299-scripts\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.967605 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f52a11-abe0-44d2-b543-80fd120a6299-logs\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.968180 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.968703 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08446d32-758b-4984-bdb9-6ede431279b2-logs\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.969036 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a6f52a11-abe0-44d2-b543-80fd120a6299-horizon-secret-key\") pod 
\"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.969214 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6f52a11-abe0-44d2-b543-80fd120a6299-config-data\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.978727 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.983067 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.984058 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/08446d32-758b-4984-bdb9-6ede431279b2-ceph\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.986143 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 
crc kubenswrapper[4755]: I1006 09:10:35.988164 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.989349 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f9kt\" (UniqueName: \"kubernetes.io/projected/08446d32-758b-4984-bdb9-6ede431279b2-kube-api-access-4f9kt\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:35 crc kubenswrapper[4755]: I1006 09:10:35.991912 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzbh2\" (UniqueName: \"kubernetes.io/projected/a6f52a11-abe0-44d2-b543-80fd120a6299-kube-api-access-qzbh2\") pod \"horizon-58d8996c-r5v5j\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.027644 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.034794 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.042733 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.068028 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-horizon-secret-key\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.068408 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-config-data\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.068601 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smn7b\" (UniqueName: \"kubernetes.io/projected/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-kube-api-access-smn7b\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.068642 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-scripts\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.068689 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-logs\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 
09:10:36.069211 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-logs\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.070274 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-config-data\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.076315 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-horizon-secret-key\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.076734 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-scripts\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.098498 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smn7b\" (UniqueName: \"kubernetes.io/projected/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-kube-api-access-smn7b\") pod \"horizon-85c75dc44f-w5tzr\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.163677 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.451382 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-r74gz"] Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.591539 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58d8996c-r5v5j"] Oct 06 09:10:36 crc kubenswrapper[4755]: W1006 09:10:36.635601 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6f52a11_abe0_44d2_b543_80fd120a6299.slice/crio-f6b9641bb8f0ea06103925627624d904a3504a2310b201e3b70794d7120649c6 WatchSource:0}: Error finding container f6b9641bb8f0ea06103925627624d904a3504a2310b201e3b70794d7120649c6: Status 404 returned error can't find the container with id f6b9641bb8f0ea06103925627624d904a3504a2310b201e3b70794d7120649c6 Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.729108 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85c75dc44f-w5tzr"] Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.782297 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"708a75eb-b436-40c0-b25c-8935f399cb4a","Type":"ContainerStarted","Data":"d4f335597b275beb513a02d51cc7e3bff21177f4e70d60c56f1dab009a99d2e3"} Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.785078 4755 generic.go:334] "Generic (PLEG): container finished" podID="95bc5e36-48f0-46d8-a2d6-ef94e52c7b96" containerID="e778b1f2d5ce4896aa5c58cc165202c69b8f1b127620513830d5195477c025be" exitCode=0 Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.785168 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-r74gz" event={"ID":"95bc5e36-48f0-46d8-a2d6-ef94e52c7b96","Type":"ContainerDied","Data":"e778b1f2d5ce4896aa5c58cc165202c69b8f1b127620513830d5195477c025be"} Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 
09:10:36.785205 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-r74gz" event={"ID":"95bc5e36-48f0-46d8-a2d6-ef94e52c7b96","Type":"ContainerStarted","Data":"fb5f19c7e86709af7364992cfd866ec3238df08ba053e2036f6432608c992b8f"} Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.786556 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e9a5baec-e335-4430-87ff-df995cc28434","Type":"ContainerStarted","Data":"2fcf1722ea52d294a01f1ccfb9fb99fcbe4fbc1b4e233c4009fbd48bf21304f8"} Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.788030 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.791651 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58d8996c-r5v5j" event={"ID":"a6f52a11-abe0-44d2-b543-80fd120a6299","Type":"ContainerStarted","Data":"f6b9641bb8f0ea06103925627624d904a3504a2310b201e3b70794d7120649c6"} Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.791725 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:36 crc kubenswrapper[4755]: W1006 09:10:36.794076 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ffa4bf1_4928_4e5c_acf2_1ff7ffaec08b.slice/crio-bc56fafe4d725aa171975511cb0de6360f5b52bfa0aca109a5154e4bec800a9a WatchSource:0}: Error finding container bc56fafe4d725aa171975511cb0de6360f5b52bfa0aca109a5154e4bec800a9a: Status 404 returned error can't find the container with id bc56fafe4d725aa171975511cb0de6360f5b52bfa0aca109a5154e4bec800a9a Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.810223 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.839695 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.885927 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-combined-ca-bundle\") pod \"08446d32-758b-4984-bdb9-6ede431279b2\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886005 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"756b4ec9-3c99-437a-a8af-5a114fb1828e\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886033 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08446d32-758b-4984-bdb9-6ede431279b2-logs\") pod \"08446d32-758b-4984-bdb9-6ede431279b2\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886061 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-config-data\") pod \"756b4ec9-3c99-437a-a8af-5a114fb1828e\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886081 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08446d32-758b-4984-bdb9-6ede431279b2-httpd-run\") pod \"08446d32-758b-4984-bdb9-6ede431279b2\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " Oct 06 09:10:36 crc kubenswrapper[4755]: 
I1006 09:10:36.886100 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-combined-ca-bundle\") pod \"756b4ec9-3c99-437a-a8af-5a114fb1828e\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886145 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4ftf\" (UniqueName: \"kubernetes.io/projected/756b4ec9-3c99-437a-a8af-5a114fb1828e-kube-api-access-w4ftf\") pod \"756b4ec9-3c99-437a-a8af-5a114fb1828e\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886261 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/756b4ec9-3c99-437a-a8af-5a114fb1828e-ceph\") pod \"756b4ec9-3c99-437a-a8af-5a114fb1828e\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886285 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-public-tls-certs\") pod \"756b4ec9-3c99-437a-a8af-5a114fb1828e\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886309 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-scripts\") pod \"756b4ec9-3c99-437a-a8af-5a114fb1828e\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886328 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-config-data\") pod 
\"08446d32-758b-4984-bdb9-6ede431279b2\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886364 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-internal-tls-certs\") pod \"08446d32-758b-4984-bdb9-6ede431279b2\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886420 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"08446d32-758b-4984-bdb9-6ede431279b2\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886445 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f9kt\" (UniqueName: \"kubernetes.io/projected/08446d32-758b-4984-bdb9-6ede431279b2-kube-api-access-4f9kt\") pod \"08446d32-758b-4984-bdb9-6ede431279b2\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886486 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-scripts\") pod \"08446d32-758b-4984-bdb9-6ede431279b2\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886511 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/756b4ec9-3c99-437a-a8af-5a114fb1828e-logs\") pod \"756b4ec9-3c99-437a-a8af-5a114fb1828e\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886543 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/756b4ec9-3c99-437a-a8af-5a114fb1828e-httpd-run\") pod \"756b4ec9-3c99-437a-a8af-5a114fb1828e\" (UID: \"756b4ec9-3c99-437a-a8af-5a114fb1828e\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.886615 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/08446d32-758b-4984-bdb9-6ede431279b2-ceph\") pod \"08446d32-758b-4984-bdb9-6ede431279b2\" (UID: \"08446d32-758b-4984-bdb9-6ede431279b2\") " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.889849 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08446d32-758b-4984-bdb9-6ede431279b2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "08446d32-758b-4984-bdb9-6ede431279b2" (UID: "08446d32-758b-4984-bdb9-6ede431279b2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.893965 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/756b4ec9-3c99-437a-a8af-5a114fb1828e-logs" (OuterVolumeSpecName: "logs") pod "756b4ec9-3c99-437a-a8af-5a114fb1828e" (UID: "756b4ec9-3c99-437a-a8af-5a114fb1828e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.894175 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08446d32-758b-4984-bdb9-6ede431279b2-logs" (OuterVolumeSpecName: "logs") pod "08446d32-758b-4984-bdb9-6ede431279b2" (UID: "08446d32-758b-4984-bdb9-6ede431279b2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.894246 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/756b4ec9-3c99-437a-a8af-5a114fb1828e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "756b4ec9-3c99-437a-a8af-5a114fb1828e" (UID: "756b4ec9-3c99-437a-a8af-5a114fb1828e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.903884 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "08446d32-758b-4984-bdb9-6ede431279b2" (UID: "08446d32-758b-4984-bdb9-6ede431279b2"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.908942 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08446d32-758b-4984-bdb9-6ede431279b2-ceph" (OuterVolumeSpecName: "ceph") pod "08446d32-758b-4984-bdb9-6ede431279b2" (UID: "08446d32-758b-4984-bdb9-6ede431279b2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.910169 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-config-data" (OuterVolumeSpecName: "config-data") pod "756b4ec9-3c99-437a-a8af-5a114fb1828e" (UID: "756b4ec9-3c99-437a-a8af-5a114fb1828e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.911292 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "756b4ec9-3c99-437a-a8af-5a114fb1828e" (UID: "756b4ec9-3c99-437a-a8af-5a114fb1828e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.911382 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "756b4ec9-3c99-437a-a8af-5a114fb1828e" (UID: "756b4ec9-3c99-437a-a8af-5a114fb1828e"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.912117 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-scripts" (OuterVolumeSpecName: "scripts") pod "08446d32-758b-4984-bdb9-6ede431279b2" (UID: "08446d32-758b-4984-bdb9-6ede431279b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.913012 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08446d32-758b-4984-bdb9-6ede431279b2" (UID: "08446d32-758b-4984-bdb9-6ede431279b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.915853 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08446d32-758b-4984-bdb9-6ede431279b2-kube-api-access-4f9kt" (OuterVolumeSpecName: "kube-api-access-4f9kt") pod "08446d32-758b-4984-bdb9-6ede431279b2" (UID: "08446d32-758b-4984-bdb9-6ede431279b2"). InnerVolumeSpecName "kube-api-access-4f9kt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.918954 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756b4ec9-3c99-437a-a8af-5a114fb1828e-kube-api-access-w4ftf" (OuterVolumeSpecName: "kube-api-access-w4ftf") pod "756b4ec9-3c99-437a-a8af-5a114fb1828e" (UID: "756b4ec9-3c99-437a-a8af-5a114fb1828e"). InnerVolumeSpecName "kube-api-access-w4ftf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.921704 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-config-data" (OuterVolumeSpecName: "config-data") pod "08446d32-758b-4984-bdb9-6ede431279b2" (UID: "08446d32-758b-4984-bdb9-6ede431279b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.937315 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-scripts" (OuterVolumeSpecName: "scripts") pod "756b4ec9-3c99-437a-a8af-5a114fb1828e" (UID: "756b4ec9-3c99-437a-a8af-5a114fb1828e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.938791 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "08446d32-758b-4984-bdb9-6ede431279b2" (UID: "08446d32-758b-4984-bdb9-6ede431279b2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.950710 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "756b4ec9-3c99-437a-a8af-5a114fb1828e" (UID: "756b4ec9-3c99-437a-a8af-5a114fb1828e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.955027 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756b4ec9-3c99-437a-a8af-5a114fb1828e-ceph" (OuterVolumeSpecName: "ceph") pod "756b4ec9-3c99-437a-a8af-5a114fb1828e" (UID: "756b4ec9-3c99-437a-a8af-5a114fb1828e"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.990604 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/756b4ec9-3c99-437a-a8af-5a114fb1828e-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991149 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991618 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991637 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991650 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991685 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991695 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f9kt\" (UniqueName: \"kubernetes.io/projected/08446d32-758b-4984-bdb9-6ede431279b2-kube-api-access-4f9kt\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991705 4755 reconciler_common.go:293] "Volume 
detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991714 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/756b4ec9-3c99-437a-a8af-5a114fb1828e-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991723 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/756b4ec9-3c99-437a-a8af-5a114fb1828e-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991732 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/08446d32-758b-4984-bdb9-6ede431279b2-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991762 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08446d32-758b-4984-bdb9-6ede431279b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991780 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991790 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08446d32-758b-4984-bdb9-6ede431279b2-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991800 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991809 
4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08446d32-758b-4984-bdb9-6ede431279b2-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991823 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756b4ec9-3c99-437a-a8af-5a114fb1828e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:36 crc kubenswrapper[4755]: I1006 09:10:36.991843 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4ftf\" (UniqueName: \"kubernetes.io/projected/756b4ec9-3c99-437a-a8af-5a114fb1828e-kube-api-access-w4ftf\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.013641 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.020233 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.113294 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.113333 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.817950 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85c75dc44f-w5tzr" 
event={"ID":"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b","Type":"ContainerStarted","Data":"bc56fafe4d725aa171975511cb0de6360f5b52bfa0aca109a5154e4bec800a9a"} Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.823891 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"708a75eb-b436-40c0-b25c-8935f399cb4a","Type":"ContainerStarted","Data":"196da9477aa46153f87390ae99ef0183cf3068c3c05d3b5192bd69e4e528f132"} Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.823920 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"708a75eb-b436-40c0-b25c-8935f399cb4a","Type":"ContainerStarted","Data":"f3382621b31c82a8c910a25a9428b4b20ba647182e18bcb7fca3177bdaa45f6c"} Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.825959 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.830025 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e9a5baec-e335-4430-87ff-df995cc28434","Type":"ContainerStarted","Data":"39adfbc757a7559a7e0c1bf0d2bbd774822d943f88074c5be75db495c7edbba0"} Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.830068 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e9a5baec-e335-4430-87ff-df995cc28434","Type":"ContainerStarted","Data":"39df52f1a10479d4e995ef61f61cbddce7f8d2ff145144bbbb1cc8740a3016ad"} Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.830117 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.861158 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.8380727070000002 podStartE2EDuration="3.861129915s" podCreationTimestamp="2025-10-06 09:10:34 +0000 UTC" firstStartedPulling="2025-10-06 09:10:36.043060605 +0000 UTC m=+2892.872375819" lastFinishedPulling="2025-10-06 09:10:37.066117803 +0000 UTC m=+2893.895433027" observedRunningTime="2025-10-06 09:10:37.850924754 +0000 UTC m=+2894.680239968" watchObservedRunningTime="2025-10-06 09:10:37.861129915 +0000 UTC m=+2894.690445119" Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.883791 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.009829682 podStartE2EDuration="3.88376026s" podCreationTimestamp="2025-10-06 09:10:34 +0000 UTC" firstStartedPulling="2025-10-06 09:10:35.809320572 +0000 UTC m=+2892.638635786" lastFinishedPulling="2025-10-06 09:10:36.68325114 +0000 UTC m=+2893.512566364" observedRunningTime="2025-10-06 09:10:37.871776926 +0000 UTC m=+2894.701092140" watchObservedRunningTime="2025-10-06 09:10:37.88376026 +0000 UTC m=+2894.713075474" Oct 06 09:10:37 crc kubenswrapper[4755]: I1006 09:10:37.984611 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.087922 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.124709 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.126815 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.132010 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.132339 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.132483 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.132734 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-psxww" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.245989 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded49320-06a2-415a-ab06-57d9c2969976-logs\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.246047 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.246094 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhl62\" (UniqueName: \"kubernetes.io/projected/ded49320-06a2-415a-ab06-57d9c2969976-kube-api-access-mhl62\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 
09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.246223 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.246249 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.246283 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ded49320-06a2-415a-ab06-57d9c2969976-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.246317 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.246346 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ded49320-06a2-415a-ab06-57d9c2969976-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc 
kubenswrapper[4755]: I1006 09:10:38.246370 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.248833 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.319537 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.348078 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.348136 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ded49320-06a2-415a-ab06-57d9c2969976-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.348169 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.348207 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded49320-06a2-415a-ab06-57d9c2969976-logs\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.348228 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.348264 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhl62\" (UniqueName: \"kubernetes.io/projected/ded49320-06a2-415a-ab06-57d9c2969976-kube-api-access-mhl62\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.348312 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.348340 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.348369 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/ded49320-06a2-415a-ab06-57d9c2969976-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.350552 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.352798 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ded49320-06a2-415a-ab06-57d9c2969976-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.360118 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.360900 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded49320-06a2-415a-ab06-57d9c2969976-logs\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.362320 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.374648 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.378834 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.379392 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ded49320-06a2-415a-ab06-57d9c2969976-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.395370 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhl62\" (UniqueName: \"kubernetes.io/projected/ded49320-06a2-415a-ab06-57d9c2969976-kube-api-access-mhl62\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.395523 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.431999 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:10:38 crc kubenswrapper[4755]: 
I1006 09:10:38.433855 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.437597 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.457994 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.461079 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.461397 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.488990 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58d8996c-r5v5j"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.518527 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77dcf6c7d-x7gnh"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.519036 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.520137 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.527535 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.576873 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-scripts\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.577891 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-config-data\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.577969 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-logs\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.578037 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.579443 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.580117 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-ceph\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.581006 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwc22\" (UniqueName: \"kubernetes.io/projected/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-kube-api-access-mwc22\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.581075 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.581334 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.599736 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-r74gz" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.612630 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.629368 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77dcf6c7d-x7gnh"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.682967 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683022 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-ceph\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683045 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwc22\" (UniqueName: \"kubernetes.io/projected/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-kube-api-access-mwc22\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683065 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683119 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683180 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-scripts\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683219 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-combined-ca-bundle\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683244 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-horizon-secret-key\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683280 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-config-data\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683299 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-logs\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683318 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9bbr\" (UniqueName: \"kubernetes.io/projected/44e61052-105b-4bd0-8056-8a29dec9fcfe-kube-api-access-x9bbr\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683335 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-horizon-tls-certs\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683353 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683378 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44e61052-105b-4bd0-8056-8a29dec9fcfe-logs\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683402 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44e61052-105b-4bd0-8056-8a29dec9fcfe-scripts\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683426 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44e61052-105b-4bd0-8056-8a29dec9fcfe-config-data\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.683688 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.693348 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-ceph\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.703332 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-config-data\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.703893 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-logs\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.712921 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.713166 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.713191 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.731379 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwc22\" (UniqueName: \"kubernetes.io/projected/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-kube-api-access-mwc22\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.731446 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.734166 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-scripts\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.744745 4755 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.745270 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.752666 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: E1006 09:10:38.755160 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle glance public-tls-certs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="901c87c8-63c8-49d0-b9dc-1c1729ae2e91" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.761648 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85c75dc44f-w5tzr"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.777027 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5797d74dbd-6v4nj"] Oct 06 09:10:38 crc kubenswrapper[4755]: E1006 09:10:38.777802 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95bc5e36-48f0-46d8-a2d6-ef94e52c7b96" containerName="mariadb-database-create" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.777828 4755 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="95bc5e36-48f0-46d8-a2d6-ef94e52c7b96" containerName="mariadb-database-create" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.778152 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="95bc5e36-48f0-46d8-a2d6-ef94e52c7b96" containerName="mariadb-database-create" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.779954 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.787152 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5797d74dbd-6v4nj"] Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.790211 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27pqd\" (UniqueName: \"kubernetes.io/projected/95bc5e36-48f0-46d8-a2d6-ef94e52c7b96-kube-api-access-27pqd\") pod \"95bc5e36-48f0-46d8-a2d6-ef94e52c7b96\" (UID: \"95bc5e36-48f0-46d8-a2d6-ef94e52c7b96\") " Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.790936 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-combined-ca-bundle\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.790975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-horizon-secret-key\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.791069 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9bbr\" (UniqueName: 
\"kubernetes.io/projected/44e61052-105b-4bd0-8056-8a29dec9fcfe-kube-api-access-x9bbr\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.791099 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-horizon-tls-certs\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.791156 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44e61052-105b-4bd0-8056-8a29dec9fcfe-logs\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.791205 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44e61052-105b-4bd0-8056-8a29dec9fcfe-scripts\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.791257 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44e61052-105b-4bd0-8056-8a29dec9fcfe-config-data\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.795843 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44e61052-105b-4bd0-8056-8a29dec9fcfe-logs\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " 
pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.798609 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44e61052-105b-4bd0-8056-8a29dec9fcfe-scripts\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.802338 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95bc5e36-48f0-46d8-a2d6-ef94e52c7b96-kube-api-access-27pqd" (OuterVolumeSpecName: "kube-api-access-27pqd") pod "95bc5e36-48f0-46d8-a2d6-ef94e52c7b96" (UID: "95bc5e36-48f0-46d8-a2d6-ef94e52c7b96"). InnerVolumeSpecName "kube-api-access-27pqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.808123 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-horizon-tls-certs\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.809322 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44e61052-105b-4bd0-8056-8a29dec9fcfe-config-data\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.823131 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-combined-ca-bundle\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc 
kubenswrapper[4755]: I1006 09:10:38.826178 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-horizon-secret-key\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.829162 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9bbr\" (UniqueName: \"kubernetes.io/projected/44e61052-105b-4bd0-8056-8a29dec9fcfe-kube-api-access-x9bbr\") pod \"horizon-77dcf6c7d-x7gnh\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.830037 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.872260 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-r74gz" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.873696 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.875120 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-r74gz" event={"ID":"95bc5e36-48f0-46d8-a2d6-ef94e52c7b96","Type":"ContainerDied","Data":"fb5f19c7e86709af7364992cfd866ec3238df08ba053e2036f6432608c992b8f"} Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.875250 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb5f19c7e86709af7364992cfd866ec3238df08ba053e2036f6432608c992b8f" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.895218 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.900927 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3583b65a-632c-4988-81fa-d1ee08e8f258-horizon-secret-key\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.901025 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3583b65a-632c-4988-81fa-d1ee08e8f258-combined-ca-bundle\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.901121 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3583b65a-632c-4988-81fa-d1ee08e8f258-scripts\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.901199 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3583b65a-632c-4988-81fa-d1ee08e8f258-logs\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.901241 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3583b65a-632c-4988-81fa-d1ee08e8f258-horizon-tls-certs\") pod \"horizon-5797d74dbd-6v4nj\" (UID: 
\"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.901368 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjgf6\" (UniqueName: \"kubernetes.io/projected/3583b65a-632c-4988-81fa-d1ee08e8f258-kube-api-access-wjgf6\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.901507 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3583b65a-632c-4988-81fa-d1ee08e8f258-config-data\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.901651 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27pqd\" (UniqueName: \"kubernetes.io/projected/95bc5e36-48f0-46d8-a2d6-ef94e52c7b96-kube-api-access-27pqd\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.927826 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:38 crc kubenswrapper[4755]: I1006 09:10:38.950352 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.003030 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-combined-ca-bundle\") pod \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.003198 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-logs\") pod \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.003228 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwc22\" (UniqueName: \"kubernetes.io/projected/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-kube-api-access-mwc22\") pod \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.003359 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-public-tls-certs\") pod \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.003424 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-config-data\") pod \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\" (UID: 
\"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.003451 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.003525 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-scripts\") pod \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.003605 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-logs" (OuterVolumeSpecName: "logs") pod "901c87c8-63c8-49d0-b9dc-1c1729ae2e91" (UID: "901c87c8-63c8-49d0-b9dc-1c1729ae2e91"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.003644 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-ceph\") pod \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.003759 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-httpd-run\") pod \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\" (UID: \"901c87c8-63c8-49d0-b9dc-1c1729ae2e91\") " Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.004160 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3583b65a-632c-4988-81fa-d1ee08e8f258-config-data\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.004246 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3583b65a-632c-4988-81fa-d1ee08e8f258-horizon-secret-key\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.004270 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3583b65a-632c-4988-81fa-d1ee08e8f258-combined-ca-bundle\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.004321 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3583b65a-632c-4988-81fa-d1ee08e8f258-scripts\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.004388 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3583b65a-632c-4988-81fa-d1ee08e8f258-logs\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.004425 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3583b65a-632c-4988-81fa-d1ee08e8f258-horizon-tls-certs\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.004533 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjgf6\" (UniqueName: \"kubernetes.io/projected/3583b65a-632c-4988-81fa-d1ee08e8f258-kube-api-access-wjgf6\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.004665 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.005710 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3583b65a-632c-4988-81fa-d1ee08e8f258-scripts\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " 
pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.006049 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3583b65a-632c-4988-81fa-d1ee08e8f258-logs\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.012536 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "901c87c8-63c8-49d0-b9dc-1c1729ae2e91" (UID: "901c87c8-63c8-49d0-b9dc-1c1729ae2e91"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.013498 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3583b65a-632c-4988-81fa-d1ee08e8f258-config-data\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.013744 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-kube-api-access-mwc22" (OuterVolumeSpecName: "kube-api-access-mwc22") pod "901c87c8-63c8-49d0-b9dc-1c1729ae2e91" (UID: "901c87c8-63c8-49d0-b9dc-1c1729ae2e91"). InnerVolumeSpecName "kube-api-access-mwc22". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.014289 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "901c87c8-63c8-49d0-b9dc-1c1729ae2e91" (UID: "901c87c8-63c8-49d0-b9dc-1c1729ae2e91"). 
InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.017420 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-ceph" (OuterVolumeSpecName: "ceph") pod "901c87c8-63c8-49d0-b9dc-1c1729ae2e91" (UID: "901c87c8-63c8-49d0-b9dc-1c1729ae2e91"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.021143 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-scripts" (OuterVolumeSpecName: "scripts") pod "901c87c8-63c8-49d0-b9dc-1c1729ae2e91" (UID: "901c87c8-63c8-49d0-b9dc-1c1729ae2e91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.025820 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-config-data" (OuterVolumeSpecName: "config-data") pod "901c87c8-63c8-49d0-b9dc-1c1729ae2e91" (UID: "901c87c8-63c8-49d0-b9dc-1c1729ae2e91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.027846 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "901c87c8-63c8-49d0-b9dc-1c1729ae2e91" (UID: "901c87c8-63c8-49d0-b9dc-1c1729ae2e91"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.033865 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3583b65a-632c-4988-81fa-d1ee08e8f258-combined-ca-bundle\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.034679 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjgf6\" (UniqueName: \"kubernetes.io/projected/3583b65a-632c-4988-81fa-d1ee08e8f258-kube-api-access-wjgf6\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.035822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3583b65a-632c-4988-81fa-d1ee08e8f258-horizon-secret-key\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.056977 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "901c87c8-63c8-49d0-b9dc-1c1729ae2e91" (UID: "901c87c8-63c8-49d0-b9dc-1c1729ae2e91"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.057408 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3583b65a-632c-4988-81fa-d1ee08e8f258-horizon-tls-certs\") pod \"horizon-5797d74dbd-6v4nj\" (UID: \"3583b65a-632c-4988-81fa-d1ee08e8f258\") " pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.101717 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p95h5"] Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.106497 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.106967 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.107004 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.107014 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.107025 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.107033 4755 reconciler_common.go:293] "Volume detached for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.107042 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.107051 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwc22\" (UniqueName: \"kubernetes.io/projected/901c87c8-63c8-49d0-b9dc-1c1729ae2e91-kube-api-access-mwc22\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.128500 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.131134 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.210015 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.343983 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.572298 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77dcf6c7d-x7gnh"] Oct 06 09:10:39 crc kubenswrapper[4755]: W1006 09:10:39.593722 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44e61052_105b_4bd0_8056_8a29dec9fcfe.slice/crio-7a65f606511138dddf084201ea24a9eca35b6fc301c117541b4aefa75bb56ae0 WatchSource:0}: Error finding container 7a65f606511138dddf084201ea24a9eca35b6fc301c117541b4aefa75bb56ae0: Status 404 returned error can't find the container with id 7a65f606511138dddf084201ea24a9eca35b6fc301c117541b4aefa75bb56ae0 Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.787138 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5797d74dbd-6v4nj"] Oct 06 09:10:39 crc kubenswrapper[4755]: W1006 09:10:39.806195 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3583b65a_632c_4988_81fa_d1ee08e8f258.slice/crio-107b393a8a7313e0da6558c82ffa62a95d836e5bf7c3b118d263b688a617ea2e WatchSource:0}: Error finding container 107b393a8a7313e0da6558c82ffa62a95d836e5bf7c3b118d263b688a617ea2e: Status 404 returned error can't find the container with id 107b393a8a7313e0da6558c82ffa62a95d836e5bf7c3b118d263b688a617ea2e Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.900438 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="08446d32-758b-4984-bdb9-6ede431279b2" path="/var/lib/kubelet/pods/08446d32-758b-4984-bdb9-6ede431279b2/volumes" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.900859 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.901030 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="756b4ec9-3c99-437a-a8af-5a114fb1828e" path="/var/lib/kubelet/pods/756b4ec9-3c99-437a-a8af-5a114fb1828e/volumes" Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.901783 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77dcf6c7d-x7gnh" event={"ID":"44e61052-105b-4bd0-8056-8a29dec9fcfe","Type":"ContainerStarted","Data":"7a65f606511138dddf084201ea24a9eca35b6fc301c117541b4aefa75bb56ae0"} Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.901820 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ded49320-06a2-415a-ab06-57d9c2969976","Type":"ContainerStarted","Data":"91485a3987e0a0cb2e4d38fc1080368059a68930b5bee704111a2fb808882f7e"} Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.901837 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5797d74dbd-6v4nj" event={"ID":"3583b65a-632c-4988-81fa-d1ee08e8f258","Type":"ContainerStarted","Data":"107b393a8a7313e0da6558c82ffa62a95d836e5bf7c3b118d263b688a617ea2e"} Oct 06 09:10:39 crc kubenswrapper[4755]: I1006 09:10:39.981345 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.037677 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.047126 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-external-api-0"] Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.059377 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.061283 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.063978 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.068542 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.073829 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.095842 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.236124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/594fa22c-967e-4e3f-aa38-609978425183-ceph\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.236185 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/594fa22c-967e-4e3f-aa38-609978425183-logs\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.236286 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/594fa22c-967e-4e3f-aa38-609978425183-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.236371 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594fa22c-967e-4e3f-aa38-609978425183-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.236440 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594fa22c-967e-4e3f-aa38-609978425183-config-data\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.236464 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594fa22c-967e-4e3f-aa38-609978425183-scripts\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.236480 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd5tr\" (UniqueName: \"kubernetes.io/projected/594fa22c-967e-4e3f-aa38-609978425183-kube-api-access-cd5tr\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 
09:10:40.236501 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/594fa22c-967e-4e3f-aa38-609978425183-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.236530 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.344672 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/594fa22c-967e-4e3f-aa38-609978425183-ceph\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.345353 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/594fa22c-967e-4e3f-aa38-609978425183-logs\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.345429 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/594fa22c-967e-4e3f-aa38-609978425183-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.345502 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594fa22c-967e-4e3f-aa38-609978425183-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.345549 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594fa22c-967e-4e3f-aa38-609978425183-config-data\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.345595 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594fa22c-967e-4e3f-aa38-609978425183-scripts\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.345611 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd5tr\" (UniqueName: \"kubernetes.io/projected/594fa22c-967e-4e3f-aa38-609978425183-kube-api-access-cd5tr\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.345629 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/594fa22c-967e-4e3f-aa38-609978425183-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.345655 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.346028 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.348669 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/594fa22c-967e-4e3f-aa38-609978425183-logs\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.354865 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/594fa22c-967e-4e3f-aa38-609978425183-ceph\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.355940 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594fa22c-967e-4e3f-aa38-609978425183-scripts\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.357880 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594fa22c-967e-4e3f-aa38-609978425183-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.358377 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/594fa22c-967e-4e3f-aa38-609978425183-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.364932 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594fa22c-967e-4e3f-aa38-609978425183-config-data\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.372071 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/594fa22c-967e-4e3f-aa38-609978425183-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.397494 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd5tr\" (UniqueName: \"kubernetes.io/projected/594fa22c-967e-4e3f-aa38-609978425183-kube-api-access-cd5tr\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.411913 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"594fa22c-967e-4e3f-aa38-609978425183\") " pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc 
kubenswrapper[4755]: I1006 09:10:40.695876 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.941421 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p95h5" podUID="7a076ee5-710a-40b7-af6d-ef16a3a186f7" containerName="registry-server" containerID="cri-o://3b6f09f4ece78e1ef831074413ebba2deb4a24632b778e73e3459006cacaac8f" gracePeriod=2 Oct 06 09:10:40 crc kubenswrapper[4755]: I1006 09:10:40.942022 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ded49320-06a2-415a-ab06-57d9c2969976","Type":"ContainerStarted","Data":"76483577e1734d2a556949998130be6d9eea8b6ccc077c813eda32aee070176c"} Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.479368 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.598311 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.697369 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnbvm\" (UniqueName: \"kubernetes.io/projected/7a076ee5-710a-40b7-af6d-ef16a3a186f7-kube-api-access-tnbvm\") pod \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\" (UID: \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\") " Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.697999 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a076ee5-710a-40b7-af6d-ef16a3a186f7-catalog-content\") pod \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\" (UID: \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\") " Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.698132 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a076ee5-710a-40b7-af6d-ef16a3a186f7-utilities\") pod \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\" (UID: \"7a076ee5-710a-40b7-af6d-ef16a3a186f7\") " Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.698929 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a076ee5-710a-40b7-af6d-ef16a3a186f7-utilities" (OuterVolumeSpecName: "utilities") pod "7a076ee5-710a-40b7-af6d-ef16a3a186f7" (UID: "7a076ee5-710a-40b7-af6d-ef16a3a186f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.705052 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a076ee5-710a-40b7-af6d-ef16a3a186f7-kube-api-access-tnbvm" (OuterVolumeSpecName: "kube-api-access-tnbvm") pod "7a076ee5-710a-40b7-af6d-ef16a3a186f7" (UID: "7a076ee5-710a-40b7-af6d-ef16a3a186f7"). InnerVolumeSpecName "kube-api-access-tnbvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.760882 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a076ee5-710a-40b7-af6d-ef16a3a186f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a076ee5-710a-40b7-af6d-ef16a3a186f7" (UID: "7a076ee5-710a-40b7-af6d-ef16a3a186f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.802938 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a076ee5-710a-40b7-af6d-ef16a3a186f7-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.803035 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnbvm\" (UniqueName: \"kubernetes.io/projected/7a076ee5-710a-40b7-af6d-ef16a3a186f7-kube-api-access-tnbvm\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.803054 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a076ee5-710a-40b7-af6d-ef16a3a186f7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.897029 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="901c87c8-63c8-49d0-b9dc-1c1729ae2e91" path="/var/lib/kubelet/pods/901c87c8-63c8-49d0-b9dc-1c1729ae2e91/volumes" Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.970788 4755 generic.go:334] "Generic (PLEG): container finished" podID="7a076ee5-710a-40b7-af6d-ef16a3a186f7" containerID="3b6f09f4ece78e1ef831074413ebba2deb4a24632b778e73e3459006cacaac8f" exitCode=0 Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.970853 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95h5" 
event={"ID":"7a076ee5-710a-40b7-af6d-ef16a3a186f7","Type":"ContainerDied","Data":"3b6f09f4ece78e1ef831074413ebba2deb4a24632b778e73e3459006cacaac8f"} Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.970882 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95h5" event={"ID":"7a076ee5-710a-40b7-af6d-ef16a3a186f7","Type":"ContainerDied","Data":"0d06b84b7bd1e1e1fa9e7e7653dca4d8788fac49ad0e0e0a06d288c88d029cdd"} Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.970900 4755 scope.go:117] "RemoveContainer" containerID="3b6f09f4ece78e1ef831074413ebba2deb4a24632b778e73e3459006cacaac8f" Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.971030 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p95h5" Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.975380 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ded49320-06a2-415a-ab06-57d9c2969976","Type":"ContainerStarted","Data":"903a5f50863aa2a622f7366f57eb24767cef143fc269fb6f6fd9da06a949523f"} Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.975549 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ded49320-06a2-415a-ab06-57d9c2969976" containerName="glance-log" containerID="cri-o://76483577e1734d2a556949998130be6d9eea8b6ccc077c813eda32aee070176c" gracePeriod=30 Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.975653 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ded49320-06a2-415a-ab06-57d9c2969976" containerName="glance-httpd" containerID="cri-o://903a5f50863aa2a622f7366f57eb24767cef143fc269fb6f6fd9da06a949523f" gracePeriod=30 Oct 06 09:10:41 crc kubenswrapper[4755]: I1006 09:10:41.977738 4755 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"594fa22c-967e-4e3f-aa38-609978425183","Type":"ContainerStarted","Data":"337c9c2b0c169180903dd92eae45e5329bdb7691c46d6152677d927cdde83fe1"} Oct 06 09:10:42 crc kubenswrapper[4755]: I1006 09:10:42.010815 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.010698539 podStartE2EDuration="5.010698539s" podCreationTimestamp="2025-10-06 09:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:10:42.009331465 +0000 UTC m=+2898.838646679" watchObservedRunningTime="2025-10-06 09:10:42.010698539 +0000 UTC m=+2898.840013753" Oct 06 09:10:42 crc kubenswrapper[4755]: I1006 09:10:42.039980 4755 scope.go:117] "RemoveContainer" containerID="25467b052ec1fb991ee20812b92f4197a2083e3a3dd312c997eb4306a35aefc2" Oct 06 09:10:42 crc kubenswrapper[4755]: I1006 09:10:42.073748 4755 scope.go:117] "RemoveContainer" containerID="11096304ac2c7aebbf04b8e8c2746c14cec0847752ef7470acb4fa1e97926f21" Oct 06 09:10:42 crc kubenswrapper[4755]: I1006 09:10:42.083654 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p95h5"] Oct 06 09:10:42 crc kubenswrapper[4755]: I1006 09:10:42.103301 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p95h5"] Oct 06 09:10:42 crc kubenswrapper[4755]: I1006 09:10:42.232062 4755 scope.go:117] "RemoveContainer" containerID="3b6f09f4ece78e1ef831074413ebba2deb4a24632b778e73e3459006cacaac8f" Oct 06 09:10:42 crc kubenswrapper[4755]: E1006 09:10:42.248113 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b6f09f4ece78e1ef831074413ebba2deb4a24632b778e73e3459006cacaac8f\": container with ID starting with 
3b6f09f4ece78e1ef831074413ebba2deb4a24632b778e73e3459006cacaac8f not found: ID does not exist" containerID="3b6f09f4ece78e1ef831074413ebba2deb4a24632b778e73e3459006cacaac8f" Oct 06 09:10:42 crc kubenswrapper[4755]: I1006 09:10:42.248154 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b6f09f4ece78e1ef831074413ebba2deb4a24632b778e73e3459006cacaac8f"} err="failed to get container status \"3b6f09f4ece78e1ef831074413ebba2deb4a24632b778e73e3459006cacaac8f\": rpc error: code = NotFound desc = could not find container \"3b6f09f4ece78e1ef831074413ebba2deb4a24632b778e73e3459006cacaac8f\": container with ID starting with 3b6f09f4ece78e1ef831074413ebba2deb4a24632b778e73e3459006cacaac8f not found: ID does not exist" Oct 06 09:10:42 crc kubenswrapper[4755]: I1006 09:10:42.248182 4755 scope.go:117] "RemoveContainer" containerID="25467b052ec1fb991ee20812b92f4197a2083e3a3dd312c997eb4306a35aefc2" Oct 06 09:10:42 crc kubenswrapper[4755]: E1006 09:10:42.249670 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25467b052ec1fb991ee20812b92f4197a2083e3a3dd312c997eb4306a35aefc2\": container with ID starting with 25467b052ec1fb991ee20812b92f4197a2083e3a3dd312c997eb4306a35aefc2 not found: ID does not exist" containerID="25467b052ec1fb991ee20812b92f4197a2083e3a3dd312c997eb4306a35aefc2" Oct 06 09:10:42 crc kubenswrapper[4755]: I1006 09:10:42.249727 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25467b052ec1fb991ee20812b92f4197a2083e3a3dd312c997eb4306a35aefc2"} err="failed to get container status \"25467b052ec1fb991ee20812b92f4197a2083e3a3dd312c997eb4306a35aefc2\": rpc error: code = NotFound desc = could not find container \"25467b052ec1fb991ee20812b92f4197a2083e3a3dd312c997eb4306a35aefc2\": container with ID starting with 25467b052ec1fb991ee20812b92f4197a2083e3a3dd312c997eb4306a35aefc2 not found: ID does not 
exist" Oct 06 09:10:42 crc kubenswrapper[4755]: I1006 09:10:42.249748 4755 scope.go:117] "RemoveContainer" containerID="11096304ac2c7aebbf04b8e8c2746c14cec0847752ef7470acb4fa1e97926f21" Oct 06 09:10:42 crc kubenswrapper[4755]: E1006 09:10:42.250093 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11096304ac2c7aebbf04b8e8c2746c14cec0847752ef7470acb4fa1e97926f21\": container with ID starting with 11096304ac2c7aebbf04b8e8c2746c14cec0847752ef7470acb4fa1e97926f21 not found: ID does not exist" containerID="11096304ac2c7aebbf04b8e8c2746c14cec0847752ef7470acb4fa1e97926f21" Oct 06 09:10:42 crc kubenswrapper[4755]: I1006 09:10:42.250136 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11096304ac2c7aebbf04b8e8c2746c14cec0847752ef7470acb4fa1e97926f21"} err="failed to get container status \"11096304ac2c7aebbf04b8e8c2746c14cec0847752ef7470acb4fa1e97926f21\": rpc error: code = NotFound desc = could not find container \"11096304ac2c7aebbf04b8e8c2746c14cec0847752ef7470acb4fa1e97926f21\": container with ID starting with 11096304ac2c7aebbf04b8e8c2746c14cec0847752ef7470acb4fa1e97926f21 not found: ID does not exist" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.034362 4755 generic.go:334] "Generic (PLEG): container finished" podID="ded49320-06a2-415a-ab06-57d9c2969976" containerID="903a5f50863aa2a622f7366f57eb24767cef143fc269fb6f6fd9da06a949523f" exitCode=0 Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.035299 4755 generic.go:334] "Generic (PLEG): container finished" podID="ded49320-06a2-415a-ab06-57d9c2969976" containerID="76483577e1734d2a556949998130be6d9eea8b6ccc077c813eda32aee070176c" exitCode=143 Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.034437 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"ded49320-06a2-415a-ab06-57d9c2969976","Type":"ContainerDied","Data":"903a5f50863aa2a622f7366f57eb24767cef143fc269fb6f6fd9da06a949523f"} Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.035447 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ded49320-06a2-415a-ab06-57d9c2969976","Type":"ContainerDied","Data":"76483577e1734d2a556949998130be6d9eea8b6ccc077c813eda32aee070176c"} Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.039472 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"594fa22c-967e-4e3f-aa38-609978425183","Type":"ContainerStarted","Data":"c23115b305a14a0e92c0f798a3addba68bdcb7c6cf5e6cff7d16291ab3c763fd"} Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.186079 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.339815 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ded49320-06a2-415a-ab06-57d9c2969976\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.340627 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ded49320-06a2-415a-ab06-57d9c2969976-ceph\") pod \"ded49320-06a2-415a-ab06-57d9c2969976\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.340677 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-combined-ca-bundle\") pod \"ded49320-06a2-415a-ab06-57d9c2969976\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " Oct 06 09:10:43 crc 
kubenswrapper[4755]: I1006 09:10:43.340851 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-internal-tls-certs\") pod \"ded49320-06a2-415a-ab06-57d9c2969976\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.340894 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-scripts\") pod \"ded49320-06a2-415a-ab06-57d9c2969976\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.341002 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded49320-06a2-415a-ab06-57d9c2969976-logs\") pod \"ded49320-06a2-415a-ab06-57d9c2969976\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.341046 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ded49320-06a2-415a-ab06-57d9c2969976-httpd-run\") pod \"ded49320-06a2-415a-ab06-57d9c2969976\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.341172 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-config-data\") pod \"ded49320-06a2-415a-ab06-57d9c2969976\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.341214 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhl62\" (UniqueName: \"kubernetes.io/projected/ded49320-06a2-415a-ab06-57d9c2969976-kube-api-access-mhl62\") pod 
\"ded49320-06a2-415a-ab06-57d9c2969976\" (UID: \"ded49320-06a2-415a-ab06-57d9c2969976\") " Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.342306 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded49320-06a2-415a-ab06-57d9c2969976-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ded49320-06a2-415a-ab06-57d9c2969976" (UID: "ded49320-06a2-415a-ab06-57d9c2969976"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.342649 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded49320-06a2-415a-ab06-57d9c2969976-logs" (OuterVolumeSpecName: "logs") pod "ded49320-06a2-415a-ab06-57d9c2969976" (UID: "ded49320-06a2-415a-ab06-57d9c2969976"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.348374 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-scripts" (OuterVolumeSpecName: "scripts") pod "ded49320-06a2-415a-ab06-57d9c2969976" (UID: "ded49320-06a2-415a-ab06-57d9c2969976"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.348746 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded49320-06a2-415a-ab06-57d9c2969976-kube-api-access-mhl62" (OuterVolumeSpecName: "kube-api-access-mhl62") pod "ded49320-06a2-415a-ab06-57d9c2969976" (UID: "ded49320-06a2-415a-ab06-57d9c2969976"). InnerVolumeSpecName "kube-api-access-mhl62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.348970 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "ded49320-06a2-415a-ab06-57d9c2969976" (UID: "ded49320-06a2-415a-ab06-57d9c2969976"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.355736 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded49320-06a2-415a-ab06-57d9c2969976-ceph" (OuterVolumeSpecName: "ceph") pod "ded49320-06a2-415a-ab06-57d9c2969976" (UID: "ded49320-06a2-415a-ab06-57d9c2969976"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.391272 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ded49320-06a2-415a-ab06-57d9c2969976" (UID: "ded49320-06a2-415a-ab06-57d9c2969976"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.414497 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ded49320-06a2-415a-ab06-57d9c2969976" (UID: "ded49320-06a2-415a-ab06-57d9c2969976"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.450348 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded49320-06a2-415a-ab06-57d9c2969976-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.450412 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ded49320-06a2-415a-ab06-57d9c2969976-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.450435 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhl62\" (UniqueName: \"kubernetes.io/projected/ded49320-06a2-415a-ab06-57d9c2969976-kube-api-access-mhl62\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.450508 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.450523 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ded49320-06a2-415a-ab06-57d9c2969976-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.450536 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.450553 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.450591 4755 reconciler_common.go:293] "Volume detached 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.469973 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-config-data" (OuterVolumeSpecName: "config-data") pod "ded49320-06a2-415a-ab06-57d9c2969976" (UID: "ded49320-06a2-415a-ab06-57d9c2969976"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.507497 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.552325 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.552356 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded49320-06a2-415a-ab06-57d9c2969976-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:43 crc kubenswrapper[4755]: I1006 09:10:43.901180 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a076ee5-710a-40b7-af6d-ef16a3a186f7" path="/var/lib/kubelet/pods/7a076ee5-710a-40b7-af6d-ef16a3a186f7/volumes" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.060203 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ded49320-06a2-415a-ab06-57d9c2969976","Type":"ContainerDied","Data":"91485a3987e0a0cb2e4d38fc1080368059a68930b5bee704111a2fb808882f7e"} Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.060206 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.060280 4755 scope.go:117] "RemoveContainer" containerID="903a5f50863aa2a622f7366f57eb24767cef143fc269fb6f6fd9da06a949523f" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.067586 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"594fa22c-967e-4e3f-aa38-609978425183","Type":"ContainerStarted","Data":"aa3c78918ae7cf36265e3e81f1ebfd82a4b8d02726ce66ed2a9afa221e4bdc9f"} Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.094934 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.116851 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.137408 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:44 crc kubenswrapper[4755]: E1006 09:10:44.138225 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a076ee5-710a-40b7-af6d-ef16a3a186f7" containerName="registry-server" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.138257 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a076ee5-710a-40b7-af6d-ef16a3a186f7" containerName="registry-server" Oct 06 09:10:44 crc kubenswrapper[4755]: E1006 09:10:44.138270 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a076ee5-710a-40b7-af6d-ef16a3a186f7" containerName="extract-content" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.138280 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a076ee5-710a-40b7-af6d-ef16a3a186f7" containerName="extract-content" Oct 06 09:10:44 crc kubenswrapper[4755]: E1006 09:10:44.138317 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ded49320-06a2-415a-ab06-57d9c2969976" containerName="glance-log" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.138326 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded49320-06a2-415a-ab06-57d9c2969976" containerName="glance-log" Oct 06 09:10:44 crc kubenswrapper[4755]: E1006 09:10:44.138341 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a076ee5-710a-40b7-af6d-ef16a3a186f7" containerName="extract-utilities" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.138348 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a076ee5-710a-40b7-af6d-ef16a3a186f7" containerName="extract-utilities" Oct 06 09:10:44 crc kubenswrapper[4755]: E1006 09:10:44.138368 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded49320-06a2-415a-ab06-57d9c2969976" containerName="glance-httpd" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.138376 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded49320-06a2-415a-ab06-57d9c2969976" containerName="glance-httpd" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.138724 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded49320-06a2-415a-ab06-57d9c2969976" containerName="glance-httpd" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.138749 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded49320-06a2-415a-ab06-57d9c2969976" containerName="glance-log" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.138780 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a076ee5-710a-40b7-af6d-ef16a3a186f7" containerName="registry-server" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.140480 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.144748 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.151572 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.160499 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.164306 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.164264008 podStartE2EDuration="4.164264008s" podCreationTimestamp="2025-10-06 09:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:10:44.129609629 +0000 UTC m=+2900.958924863" watchObservedRunningTime="2025-10-06 09:10:44.164264008 +0000 UTC m=+2900.993579222" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.276876 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.276990 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f6b8f7e-5571-4367-ad0c-76688c6968d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 
09:10:44.277065 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f6b8f7e-5571-4367-ad0c-76688c6968d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.277094 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f6b8f7e-5571-4367-ad0c-76688c6968d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.277198 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f6b8f7e-5571-4367-ad0c-76688c6968d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.277243 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f6b8f7e-5571-4367-ad0c-76688c6968d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.277350 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq6p7\" (UniqueName: \"kubernetes.io/projected/5f6b8f7e-5571-4367-ad0c-76688c6968d3-kube-api-access-pq6p7\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 
09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.277394 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5f6b8f7e-5571-4367-ad0c-76688c6968d3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.277434 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6b8f7e-5571-4367-ad0c-76688c6968d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.380528 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f6b8f7e-5571-4367-ad0c-76688c6968d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.380627 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f6b8f7e-5571-4367-ad0c-76688c6968d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.380733 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f6b8f7e-5571-4367-ad0c-76688c6968d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 
09:10:44.380770 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f6b8f7e-5571-4367-ad0c-76688c6968d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.380856 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq6p7\" (UniqueName: \"kubernetes.io/projected/5f6b8f7e-5571-4367-ad0c-76688c6968d3-kube-api-access-pq6p7\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.380901 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5f6b8f7e-5571-4367-ad0c-76688c6968d3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.380929 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6b8f7e-5571-4367-ad0c-76688c6968d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.381073 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.381138 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f6b8f7e-5571-4367-ad0c-76688c6968d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.381903 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f6b8f7e-5571-4367-ad0c-76688c6968d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.383463 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f6b8f7e-5571-4367-ad0c-76688c6968d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.384336 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.390088 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f6b8f7e-5571-4367-ad0c-76688c6968d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.392930 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5f6b8f7e-5571-4367-ad0c-76688c6968d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.393597 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6b8f7e-5571-4367-ad0c-76688c6968d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.418858 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f6b8f7e-5571-4367-ad0c-76688c6968d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.419030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5f6b8f7e-5571-4367-ad0c-76688c6968d3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.423828 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq6p7\" (UniqueName: \"kubernetes.io/projected/5f6b8f7e-5571-4367-ad0c-76688c6968d3-kube-api-access-pq6p7\") pod \"glance-default-internal-api-0\" (UID: \"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.434343 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"5f6b8f7e-5571-4367-ad0c-76688c6968d3\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:10:44 crc kubenswrapper[4755]: I1006 09:10:44.475301 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:45 crc kubenswrapper[4755]: I1006 09:10:45.241628 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 06 09:10:45 crc kubenswrapper[4755]: I1006 09:10:45.346613 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 06 09:10:45 crc kubenswrapper[4755]: I1006 09:10:45.590152 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-d472-account-create-rbm5d"] Oct 06 09:10:45 crc kubenswrapper[4755]: I1006 09:10:45.591466 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-d472-account-create-rbm5d" Oct 06 09:10:45 crc kubenswrapper[4755]: I1006 09:10:45.597295 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 06 09:10:45 crc kubenswrapper[4755]: I1006 09:10:45.609825 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-d472-account-create-rbm5d"] Oct 06 09:10:45 crc kubenswrapper[4755]: I1006 09:10:45.723402 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69zfm\" (UniqueName: \"kubernetes.io/projected/1ce0f617-8015-439d-a175-a596684cf8b9-kube-api-access-69zfm\") pod \"manila-d472-account-create-rbm5d\" (UID: \"1ce0f617-8015-439d-a175-a596684cf8b9\") " pod="openstack/manila-d472-account-create-rbm5d" Oct 06 09:10:45 crc kubenswrapper[4755]: I1006 09:10:45.825485 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69zfm\" (UniqueName: 
\"kubernetes.io/projected/1ce0f617-8015-439d-a175-a596684cf8b9-kube-api-access-69zfm\") pod \"manila-d472-account-create-rbm5d\" (UID: \"1ce0f617-8015-439d-a175-a596684cf8b9\") " pod="openstack/manila-d472-account-create-rbm5d" Oct 06 09:10:45 crc kubenswrapper[4755]: I1006 09:10:45.844096 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69zfm\" (UniqueName: \"kubernetes.io/projected/1ce0f617-8015-439d-a175-a596684cf8b9-kube-api-access-69zfm\") pod \"manila-d472-account-create-rbm5d\" (UID: \"1ce0f617-8015-439d-a175-a596684cf8b9\") " pod="openstack/manila-d472-account-create-rbm5d" Oct 06 09:10:45 crc kubenswrapper[4755]: I1006 09:10:45.894453 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded49320-06a2-415a-ab06-57d9c2969976" path="/var/lib/kubelet/pods/ded49320-06a2-415a-ab06-57d9c2969976/volumes" Oct 06 09:10:45 crc kubenswrapper[4755]: I1006 09:10:45.929350 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-d472-account-create-rbm5d" Oct 06 09:10:48 crc kubenswrapper[4755]: I1006 09:10:48.912431 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:10:48 crc kubenswrapper[4755]: I1006 09:10:48.912751 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:10:48 crc kubenswrapper[4755]: I1006 09:10:48.912801 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 09:10:48 crc kubenswrapper[4755]: I1006 09:10:48.913681 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"192f4452ee1012132588e7317f9d9bfb58ff59e73705bc43b48bde85c4a0e20f"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:10:48 crc kubenswrapper[4755]: I1006 09:10:48.913751 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://192f4452ee1012132588e7317f9d9bfb58ff59e73705bc43b48bde85c4a0e20f" gracePeriod=600 Oct 06 09:10:49 crc kubenswrapper[4755]: I1006 09:10:49.147504 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="192f4452ee1012132588e7317f9d9bfb58ff59e73705bc43b48bde85c4a0e20f" exitCode=0 Oct 06 09:10:49 crc kubenswrapper[4755]: I1006 09:10:49.147556 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"192f4452ee1012132588e7317f9d9bfb58ff59e73705bc43b48bde85c4a0e20f"} Oct 06 09:10:49 crc kubenswrapper[4755]: I1006 09:10:49.488422 4755 scope.go:117] "RemoveContainer" containerID="76483577e1734d2a556949998130be6d9eea8b6ccc077c813eda32aee070176c" Oct 06 09:10:49 crc kubenswrapper[4755]: I1006 09:10:49.586266 4755 scope.go:117] "RemoveContainer" containerID="e4232d015c8563607baff9c6312492437dcfaa3d98703a9e49606dd4d03c612a" Oct 06 09:10:50 crc kubenswrapper[4755]: I1006 09:10:50.101490 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Oct 06 09:10:50 crc kubenswrapper[4755]: W1006 09:10:50.111822 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f6b8f7e_5571_4367_ad0c_76688c6968d3.slice/crio-8ab4fa4905ba07033a88dbe0f323808cc3a644d0bb3e3544c49fc875e25488d3 WatchSource:0}: Error finding container 8ab4fa4905ba07033a88dbe0f323808cc3a644d0bb3e3544c49fc875e25488d3: Status 404 returned error can't find the container with id 8ab4fa4905ba07033a88dbe0f323808cc3a644d0bb3e3544c49fc875e25488d3 Oct 06 09:10:50 crc kubenswrapper[4755]: I1006 09:10:50.142859 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-d472-account-create-rbm5d"] Oct 06 09:10:50 crc kubenswrapper[4755]: W1006 09:10:50.154735 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ce0f617_8015_439d_a175_a596684cf8b9.slice/crio-e94509cb9357168238e96d739b8e981d7e68675f645964c2a86de37d15e8b2cc WatchSource:0}: Error finding container e94509cb9357168238e96d739b8e981d7e68675f645964c2a86de37d15e8b2cc: Status 404 returned error can't find the container with id e94509cb9357168238e96d739b8e981d7e68675f645964c2a86de37d15e8b2cc Oct 06 09:10:50 crc kubenswrapper[4755]: I1006 09:10:50.159172 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58d8996c-r5v5j" event={"ID":"a6f52a11-abe0-44d2-b543-80fd120a6299","Type":"ContainerStarted","Data":"61411b67732887bcdc8d4e37646400bf3f91d466d5fa6ed492499050944c1cdc"} Oct 06 09:10:50 crc kubenswrapper[4755]: I1006 09:10:50.160920 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77dcf6c7d-x7gnh" event={"ID":"44e61052-105b-4bd0-8056-8a29dec9fcfe","Type":"ContainerStarted","Data":"75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb"} Oct 06 09:10:50 crc kubenswrapper[4755]: I1006 09:10:50.163025 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85c75dc44f-w5tzr" event={"ID":"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b","Type":"ContainerStarted","Data":"5d6b234790e12896944731749531e5d266893f4eb87778ca74fa9ee384174e12"} Oct 06 09:10:50 crc kubenswrapper[4755]: I1006 09:10:50.165587 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f6b8f7e-5571-4367-ad0c-76688c6968d3","Type":"ContainerStarted","Data":"8ab4fa4905ba07033a88dbe0f323808cc3a644d0bb3e3544c49fc875e25488d3"} Oct 06 09:10:50 crc kubenswrapper[4755]: I1006 09:10:50.170493 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5797d74dbd-6v4nj" event={"ID":"3583b65a-632c-4988-81fa-d1ee08e8f258","Type":"ContainerStarted","Data":"547dbdf25e10793aec5651a5167ca2487cc44814f0aab3e5bce2972990092fa4"} Oct 06 09:10:50 crc kubenswrapper[4755]: I1006 09:10:50.174743 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20"} Oct 06 09:10:50 crc kubenswrapper[4755]: I1006 09:10:50.697048 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 09:10:50 crc kubenswrapper[4755]: I1006 09:10:50.697684 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 09:10:50 crc kubenswrapper[4755]: I1006 09:10:50.931370 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 09:10:50 crc kubenswrapper[4755]: I1006 09:10:50.934201 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 
09:10:51.200534 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85c75dc44f-w5tzr" event={"ID":"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b","Type":"ContainerStarted","Data":"111525c13ac7319566438bced69942dca4a3e4f4a6315117cb43762a400563ea"} Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.200915 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-85c75dc44f-w5tzr" podUID="2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" containerName="horizon-log" containerID="cri-o://5d6b234790e12896944731749531e5d266893f4eb87778ca74fa9ee384174e12" gracePeriod=30 Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.201143 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-85c75dc44f-w5tzr" podUID="2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" containerName="horizon" containerID="cri-o://111525c13ac7319566438bced69942dca4a3e4f4a6315117cb43762a400563ea" gracePeriod=30 Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.205335 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f6b8f7e-5571-4367-ad0c-76688c6968d3","Type":"ContainerStarted","Data":"ee1447baab9db9e13de28c845220dd57dd34d56a23b9a7123dc25102094657f6"} Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.208336 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5797d74dbd-6v4nj" event={"ID":"3583b65a-632c-4988-81fa-d1ee08e8f258","Type":"ContainerStarted","Data":"ac4bd0f8416038546d9060e59ace911f0fd1ec47f794eef007fef9886283c8bb"} Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.210888 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58d8996c-r5v5j" event={"ID":"a6f52a11-abe0-44d2-b543-80fd120a6299","Type":"ContainerStarted","Data":"2e8ffd8aa0b49e6e6139380bc97256febb67914037cbf29d6a50ee589eefab8e"} Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.211173 4755 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58d8996c-r5v5j" podUID="a6f52a11-abe0-44d2-b543-80fd120a6299" containerName="horizon-log" containerID="cri-o://61411b67732887bcdc8d4e37646400bf3f91d466d5fa6ed492499050944c1cdc" gracePeriod=30 Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.211462 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58d8996c-r5v5j" podUID="a6f52a11-abe0-44d2-b543-80fd120a6299" containerName="horizon" containerID="cri-o://2e8ffd8aa0b49e6e6139380bc97256febb67914037cbf29d6a50ee589eefab8e" gracePeriod=30 Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.216217 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77dcf6c7d-x7gnh" event={"ID":"44e61052-105b-4bd0-8056-8a29dec9fcfe","Type":"ContainerStarted","Data":"389df74a08b498ecdb20ff74a17163729f48bb649e4cea0823c4718993d15c73"} Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.231360 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-85c75dc44f-w5tzr" podStartSLOduration=3.421660356 podStartE2EDuration="16.231342033s" podCreationTimestamp="2025-10-06 09:10:35 +0000 UTC" firstStartedPulling="2025-10-06 09:10:36.796624541 +0000 UTC m=+2893.625939755" lastFinishedPulling="2025-10-06 09:10:49.606306208 +0000 UTC m=+2906.435621432" observedRunningTime="2025-10-06 09:10:51.217728809 +0000 UTC m=+2908.047044023" watchObservedRunningTime="2025-10-06 09:10:51.231342033 +0000 UTC m=+2908.060657247" Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.231932 4755 generic.go:334] "Generic (PLEG): container finished" podID="1ce0f617-8015-439d-a175-a596684cf8b9" containerID="86e04bde33c90f3e0e7d0575c2e650677d0948577f4a4f813602fc3d2e24e3f6" exitCode=0 Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.233669 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-d472-account-create-rbm5d" 
event={"ID":"1ce0f617-8015-439d-a175-a596684cf8b9","Type":"ContainerDied","Data":"86e04bde33c90f3e0e7d0575c2e650677d0948577f4a4f813602fc3d2e24e3f6"} Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.234097 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.234180 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-d472-account-create-rbm5d" event={"ID":"1ce0f617-8015-439d-a175-a596684cf8b9","Type":"ContainerStarted","Data":"e94509cb9357168238e96d739b8e981d7e68675f645964c2a86de37d15e8b2cc"} Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.234882 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.253786 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-77dcf6c7d-x7gnh" podStartSLOduration=3.189274369 podStartE2EDuration="13.253766003s" podCreationTimestamp="2025-10-06 09:10:38 +0000 UTC" firstStartedPulling="2025-10-06 09:10:39.599230063 +0000 UTC m=+2896.428545267" lastFinishedPulling="2025-10-06 09:10:49.663721687 +0000 UTC m=+2906.493036901" observedRunningTime="2025-10-06 09:10:51.249019207 +0000 UTC m=+2908.078334451" watchObservedRunningTime="2025-10-06 09:10:51.253766003 +0000 UTC m=+2908.083081217" Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.284117 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5797d74dbd-6v4nj" podStartSLOduration=3.44128453 podStartE2EDuration="13.284098107s" podCreationTimestamp="2025-10-06 09:10:38 +0000 UTC" firstStartedPulling="2025-10-06 09:10:39.810357122 +0000 UTC m=+2896.639672336" lastFinishedPulling="2025-10-06 09:10:49.653170689 +0000 UTC m=+2906.482485913" observedRunningTime="2025-10-06 09:10:51.273667841 +0000 UTC m=+2908.102983065" 
watchObservedRunningTime="2025-10-06 09:10:51.284098107 +0000 UTC m=+2908.113413321" Oct 06 09:10:51 crc kubenswrapper[4755]: I1006 09:10:51.303500 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-58d8996c-r5v5j" podStartSLOduration=3.280263417 podStartE2EDuration="16.303479502s" podCreationTimestamp="2025-10-06 09:10:35 +0000 UTC" firstStartedPulling="2025-10-06 09:10:36.641594669 +0000 UTC m=+2893.470909883" lastFinishedPulling="2025-10-06 09:10:49.664810754 +0000 UTC m=+2906.494125968" observedRunningTime="2025-10-06 09:10:51.301234498 +0000 UTC m=+2908.130549722" watchObservedRunningTime="2025-10-06 09:10:51.303479502 +0000 UTC m=+2908.132794716" Oct 06 09:10:52 crc kubenswrapper[4755]: I1006 09:10:52.245878 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f6b8f7e-5571-4367-ad0c-76688c6968d3","Type":"ContainerStarted","Data":"55948e4d089c5e8b0e0d75130c51908ac0a8a465b47349df6b6a516b251b242f"} Oct 06 09:10:52 crc kubenswrapper[4755]: I1006 09:10:52.327262 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.327244257 podStartE2EDuration="8.327244257s" podCreationTimestamp="2025-10-06 09:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:10:52.291445128 +0000 UTC m=+2909.120760352" watchObservedRunningTime="2025-10-06 09:10:52.327244257 +0000 UTC m=+2909.156559471" Oct 06 09:10:52 crc kubenswrapper[4755]: I1006 09:10:52.726184 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-d472-account-create-rbm5d" Oct 06 09:10:52 crc kubenswrapper[4755]: I1006 09:10:52.750393 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69zfm\" (UniqueName: \"kubernetes.io/projected/1ce0f617-8015-439d-a175-a596684cf8b9-kube-api-access-69zfm\") pod \"1ce0f617-8015-439d-a175-a596684cf8b9\" (UID: \"1ce0f617-8015-439d-a175-a596684cf8b9\") " Oct 06 09:10:52 crc kubenswrapper[4755]: I1006 09:10:52.768493 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce0f617-8015-439d-a175-a596684cf8b9-kube-api-access-69zfm" (OuterVolumeSpecName: "kube-api-access-69zfm") pod "1ce0f617-8015-439d-a175-a596684cf8b9" (UID: "1ce0f617-8015-439d-a175-a596684cf8b9"). InnerVolumeSpecName "kube-api-access-69zfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:10:52 crc kubenswrapper[4755]: I1006 09:10:52.853088 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69zfm\" (UniqueName: \"kubernetes.io/projected/1ce0f617-8015-439d-a175-a596684cf8b9-kube-api-access-69zfm\") on node \"crc\" DevicePath \"\"" Oct 06 09:10:53 crc kubenswrapper[4755]: I1006 09:10:53.255404 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-d472-account-create-rbm5d" event={"ID":"1ce0f617-8015-439d-a175-a596684cf8b9","Type":"ContainerDied","Data":"e94509cb9357168238e96d739b8e981d7e68675f645964c2a86de37d15e8b2cc"} Oct 06 09:10:53 crc kubenswrapper[4755]: I1006 09:10:53.255456 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e94509cb9357168238e96d739b8e981d7e68675f645964c2a86de37d15e8b2cc" Oct 06 09:10:53 crc kubenswrapper[4755]: I1006 09:10:53.255588 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 09:10:53 crc kubenswrapper[4755]: I1006 09:10:53.255598 4755 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Oct 06 09:10:53 crc kubenswrapper[4755]: I1006 09:10:53.255587 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-d472-account-create-rbm5d" Oct 06 09:10:53 crc kubenswrapper[4755]: I1006 09:10:53.724010 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 09:10:53 crc kubenswrapper[4755]: I1006 09:10:53.726259 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 09:10:54 crc kubenswrapper[4755]: I1006 09:10:54.476836 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:54 crc kubenswrapper[4755]: I1006 09:10:54.477176 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:54 crc kubenswrapper[4755]: I1006 09:10:54.529272 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:54 crc kubenswrapper[4755]: I1006 09:10:54.539508 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:55 crc kubenswrapper[4755]: I1006 09:10:55.274186 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:55 crc kubenswrapper[4755]: I1006 09:10:55.274823 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.043542 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.073583 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-m6zbn"] Oct 
06 09:10:56 crc kubenswrapper[4755]: E1006 09:10:56.074196 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce0f617-8015-439d-a175-a596684cf8b9" containerName="mariadb-account-create" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.074219 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce0f617-8015-439d-a175-a596684cf8b9" containerName="mariadb-account-create" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.074494 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ce0f617-8015-439d-a175-a596684cf8b9" containerName="mariadb-account-create" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.075410 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.080610 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.080883 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-5jqqk" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.089755 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-m6zbn"] Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.124692 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-combined-ca-bundle\") pod \"manila-db-sync-m6zbn\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.124733 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-job-config-data\") pod \"manila-db-sync-m6zbn\" (UID: 
\"64aecb78-16d1-476d-852e-9891c850994e\") " pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.124828 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgvwr\" (UniqueName: \"kubernetes.io/projected/64aecb78-16d1-476d-852e-9891c850994e-kube-api-access-jgvwr\") pod \"manila-db-sync-m6zbn\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.124876 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-config-data\") pod \"manila-db-sync-m6zbn\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.164029 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.226940 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgvwr\" (UniqueName: \"kubernetes.io/projected/64aecb78-16d1-476d-852e-9891c850994e-kube-api-access-jgvwr\") pod \"manila-db-sync-m6zbn\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.227023 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-config-data\") pod \"manila-db-sync-m6zbn\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.227083 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-combined-ca-bundle\") pod \"manila-db-sync-m6zbn\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.227104 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-job-config-data\") pod \"manila-db-sync-m6zbn\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.233105 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-job-config-data\") pod \"manila-db-sync-m6zbn\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.237359 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-combined-ca-bundle\") pod \"manila-db-sync-m6zbn\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.240950 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-config-data\") pod \"manila-db-sync-m6zbn\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.267226 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgvwr\" (UniqueName: \"kubernetes.io/projected/64aecb78-16d1-476d-852e-9891c850994e-kube-api-access-jgvwr\") pod \"manila-db-sync-m6zbn\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") 
" pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:56 crc kubenswrapper[4755]: I1006 09:10:56.403481 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-m6zbn" Oct 06 09:10:57 crc kubenswrapper[4755]: I1006 09:10:57.063555 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-m6zbn"] Oct 06 09:10:57 crc kubenswrapper[4755]: W1006 09:10:57.077705 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64aecb78_16d1_476d_852e_9891c850994e.slice/crio-fd06d703c7b1e734b75feee8afd2d08a1f1867f10f00d8a2a995016e300a2ed4 WatchSource:0}: Error finding container fd06d703c7b1e734b75feee8afd2d08a1f1867f10f00d8a2a995016e300a2ed4: Status 404 returned error can't find the container with id fd06d703c7b1e734b75feee8afd2d08a1f1867f10f00d8a2a995016e300a2ed4 Oct 06 09:10:57 crc kubenswrapper[4755]: I1006 09:10:57.293404 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-m6zbn" event={"ID":"64aecb78-16d1-476d-852e-9891c850994e","Type":"ContainerStarted","Data":"fd06d703c7b1e734b75feee8afd2d08a1f1867f10f00d8a2a995016e300a2ed4"} Oct 06 09:10:57 crc kubenswrapper[4755]: I1006 09:10:57.949500 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:58 crc kubenswrapper[4755]: I1006 09:10:58.423830 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 09:10:58 crc kubenswrapper[4755]: I1006 09:10:58.928317 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:58 crc kubenswrapper[4755]: I1006 09:10:58.928617 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:10:59 crc kubenswrapper[4755]: I1006 09:10:59.131779 
4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:10:59 crc kubenswrapper[4755]: I1006 09:10:59.131832 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:11:02 crc kubenswrapper[4755]: I1006 09:11:02.340116 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-m6zbn" event={"ID":"64aecb78-16d1-476d-852e-9891c850994e","Type":"ContainerStarted","Data":"a75777935df112ed2664c3f92b3389441eb91c509cd3313efc81c0c67d8fdf19"} Oct 06 09:11:02 crc kubenswrapper[4755]: I1006 09:11:02.364597 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-m6zbn" podStartSLOduration=1.738548272 podStartE2EDuration="6.364575514s" podCreationTimestamp="2025-10-06 09:10:56 +0000 UTC" firstStartedPulling="2025-10-06 09:10:57.07926572 +0000 UTC m=+2913.908580934" lastFinishedPulling="2025-10-06 09:11:01.705292972 +0000 UTC m=+2918.534608176" observedRunningTime="2025-10-06 09:11:02.353237786 +0000 UTC m=+2919.182553000" watchObservedRunningTime="2025-10-06 09:11:02.364575514 +0000 UTC m=+2919.193890738" Oct 06 09:11:08 crc kubenswrapper[4755]: I1006 09:11:08.931432 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-77dcf6c7d-x7gnh" podUID="44e61052-105b-4bd0-8056-8a29dec9fcfe" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.245:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.245:8443: connect: connection refused" Oct 06 09:11:09 crc kubenswrapper[4755]: I1006 09:11:09.135187 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5797d74dbd-6v4nj" podUID="3583b65a-632c-4988-81fa-d1ee08e8f258" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.246:8443: connect: connection refused" Oct 06 
09:11:13 crc kubenswrapper[4755]: I1006 09:11:13.444548 4755 generic.go:334] "Generic (PLEG): container finished" podID="64aecb78-16d1-476d-852e-9891c850994e" containerID="a75777935df112ed2664c3f92b3389441eb91c509cd3313efc81c0c67d8fdf19" exitCode=0 Oct 06 09:11:13 crc kubenswrapper[4755]: I1006 09:11:13.444637 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-m6zbn" event={"ID":"64aecb78-16d1-476d-852e-9891c850994e","Type":"ContainerDied","Data":"a75777935df112ed2664c3f92b3389441eb91c509cd3313efc81c0c67d8fdf19"} Oct 06 09:11:14 crc kubenswrapper[4755]: I1006 09:11:14.855892 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-m6zbn" Oct 06 09:11:14 crc kubenswrapper[4755]: I1006 09:11:14.953009 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-config-data\") pod \"64aecb78-16d1-476d-852e-9891c850994e\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " Oct 06 09:11:14 crc kubenswrapper[4755]: I1006 09:11:14.953298 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgvwr\" (UniqueName: \"kubernetes.io/projected/64aecb78-16d1-476d-852e-9891c850994e-kube-api-access-jgvwr\") pod \"64aecb78-16d1-476d-852e-9891c850994e\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " Oct 06 09:11:14 crc kubenswrapper[4755]: I1006 09:11:14.953995 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-job-config-data\") pod \"64aecb78-16d1-476d-852e-9891c850994e\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " Oct 06 09:11:14 crc kubenswrapper[4755]: I1006 09:11:14.954188 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-combined-ca-bundle\") pod \"64aecb78-16d1-476d-852e-9891c850994e\" (UID: \"64aecb78-16d1-476d-852e-9891c850994e\") " Oct 06 09:11:14 crc kubenswrapper[4755]: I1006 09:11:14.958784 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64aecb78-16d1-476d-852e-9891c850994e-kube-api-access-jgvwr" (OuterVolumeSpecName: "kube-api-access-jgvwr") pod "64aecb78-16d1-476d-852e-9891c850994e" (UID: "64aecb78-16d1-476d-852e-9891c850994e"). InnerVolumeSpecName "kube-api-access-jgvwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:11:14 crc kubenswrapper[4755]: I1006 09:11:14.959759 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "64aecb78-16d1-476d-852e-9891c850994e" (UID: "64aecb78-16d1-476d-852e-9891c850994e"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:14 crc kubenswrapper[4755]: I1006 09:11:14.965369 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-config-data" (OuterVolumeSpecName: "config-data") pod "64aecb78-16d1-476d-852e-9891c850994e" (UID: "64aecb78-16d1-476d-852e-9891c850994e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.005802 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64aecb78-16d1-476d-852e-9891c850994e" (UID: "64aecb78-16d1-476d-852e-9891c850994e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.056659 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.056690 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.056700 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgvwr\" (UniqueName: \"kubernetes.io/projected/64aecb78-16d1-476d-852e-9891c850994e-kube-api-access-jgvwr\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.056710 4755 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/64aecb78-16d1-476d-852e-9891c850994e-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.466058 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-m6zbn" event={"ID":"64aecb78-16d1-476d-852e-9891c850994e","Type":"ContainerDied","Data":"fd06d703c7b1e734b75feee8afd2d08a1f1867f10f00d8a2a995016e300a2ed4"} Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.466112 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd06d703c7b1e734b75feee8afd2d08a1f1867f10f00d8a2a995016e300a2ed4" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.466157 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-m6zbn" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.771784 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 06 09:11:15 crc kubenswrapper[4755]: E1006 09:11:15.772464 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64aecb78-16d1-476d-852e-9891c850994e" containerName="manila-db-sync" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.772491 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="64aecb78-16d1-476d-852e-9891c850994e" containerName="manila-db-sync" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.772770 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="64aecb78-16d1-476d-852e-9891c850994e" containerName="manila-db-sync" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.774513 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.776837 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.779883 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.780089 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.780192 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-5jqqk" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.783709 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.886153 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/004e482c-7cdb-4416-91d9-d0c43641625d-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.886220 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5nks\" (UniqueName: \"kubernetes.io/projected/004e482c-7cdb-4416-91d9-d0c43641625d-kube-api-access-p5nks\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.886249 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.886390 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-scripts\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.886537 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.886698 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-config-data\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.928765 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.930836 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.931936 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.935721 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.952923 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-nl7nk"] Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.959930 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.979511 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-nl7nk"] Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.991636 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.991706 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.991741 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.991765 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-config-data\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.991790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-scripts\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.991834 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.991887 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/004e482c-7cdb-4416-91d9-d0c43641625d-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.991910 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5nks\" (UniqueName: \"kubernetes.io/projected/004e482c-7cdb-4416-91d9-d0c43641625d-kube-api-access-p5nks\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.991928 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69r7l\" (UniqueName: \"kubernetes.io/projected/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-kube-api-access-69r7l\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.991947 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.991974 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.991990 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-ceph\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.992035 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-scripts\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.992263 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-config-data\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.996722 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/004e482c-7cdb-4416-91d9-d0c43641625d-etc-machine-id\") pod \"manila-scheduler-0\" (UID: 
\"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:15 crc kubenswrapper[4755]: I1006 09:11:15.997398 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.005830 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.014922 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-config-data\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.018403 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-scripts\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.022399 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5nks\" (UniqueName: \"kubernetes.io/projected/004e482c-7cdb-4416-91d9-d0c43641625d-kube-api-access-p5nks\") pod \"manila-scheduler-0\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.070833 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/manila-api-0"] Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.072491 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.074454 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.088311 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.094762 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjm95\" (UniqueName: \"kubernetes.io/projected/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-kube-api-access-tjm95\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.094813 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-config\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.094832 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.094856 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-scripts\") pod \"manila-share-share1-0\" (UID: 
\"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.094899 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.094916 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.094947 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.094975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69r7l\" (UniqueName: \"kubernetes.io/projected/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-kube-api-access-69r7l\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.095006 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: 
\"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.095020 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-ceph\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.095067 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-config-data\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.095117 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.095137 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.095165 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc 
kubenswrapper[4755]: I1006 09:11:16.101373 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.103125 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-scripts\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.103303 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.103368 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.104952 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.105843 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-ceph\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.109389 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.125516 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69r7l\" (UniqueName: \"kubernetes.io/projected/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-kube-api-access-69r7l\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.135588 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-config-data\") pod \"manila-share-share1-0\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.196720 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjm95\" (UniqueName: \"kubernetes.io/projected/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-kube-api-access-tjm95\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.196770 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-scripts\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.196791 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-config\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.196807 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06bd886-edb5-4a47-9ad7-9290ec0945c1-logs\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.196825 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.196869 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.196894 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.196932 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.196952 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-config-data\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.196967 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06bd886-edb5-4a47-9ad7-9290ec0945c1-etc-machine-id\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.197018 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8gcl\" (UniqueName: \"kubernetes.io/projected/d06bd886-edb5-4a47-9ad7-9290ec0945c1-kube-api-access-x8gcl\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.197088 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-config-data-custom\") pod 
\"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.197113 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.198062 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.198150 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-config\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.198450 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.198974 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " 
pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.200503 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.214913 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjm95\" (UniqueName: \"kubernetes.io/projected/9fe5b7f4-6615-4f73-8116-51fcda3ef59e-kube-api-access-tjm95\") pod \"dnsmasq-dns-76b5fdb995-nl7nk\" (UID: \"9fe5b7f4-6615-4f73-8116-51fcda3ef59e\") " pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.262995 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.276816 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.299687 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-config-data\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.299733 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06bd886-edb5-4a47-9ad7-9290ec0945c1-etc-machine-id\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.299797 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8gcl\" (UniqueName: \"kubernetes.io/projected/d06bd886-edb5-4a47-9ad7-9290ec0945c1-kube-api-access-x8gcl\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.299867 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-config-data-custom\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.299911 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-scripts\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.299932 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06bd886-edb5-4a47-9ad7-9290ec0945c1-logs\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.299970 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.300237 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06bd886-edb5-4a47-9ad7-9290ec0945c1-etc-machine-id\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.301080 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06bd886-edb5-4a47-9ad7-9290ec0945c1-logs\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.308376 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.318617 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-scripts\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.322251 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-config-data\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.327272 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8gcl\" (UniqueName: \"kubernetes.io/projected/d06bd886-edb5-4a47-9ad7-9290ec0945c1-kube-api-access-x8gcl\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.330315 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-config-data-custom\") pod \"manila-api-0\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.541586 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.621305 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.794326 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-nl7nk"] Oct 06 09:11:16 crc kubenswrapper[4755]: I1006 09:11:16.863465 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 06 09:11:16 crc kubenswrapper[4755]: W1006 09:11:16.875206 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21a73e92_20ea_4a05_8c43_e84b9e0bb15d.slice/crio-e19f0958cc7d469854afa55ccee533eef9d12b4cacd44026e0aa9b98206c32c2 WatchSource:0}: Error finding container e19f0958cc7d469854afa55ccee533eef9d12b4cacd44026e0aa9b98206c32c2: Status 404 returned error can't find the container with id e19f0958cc7d469854afa55ccee533eef9d12b4cacd44026e0aa9b98206c32c2 Oct 06 09:11:17 crc kubenswrapper[4755]: I1006 09:11:17.123844 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 06 09:11:17 crc kubenswrapper[4755]: I1006 09:11:17.492584 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"004e482c-7cdb-4416-91d9-d0c43641625d","Type":"ContainerStarted","Data":"6f01f52603cfb7a35a78dabd7b855324cb840f88e0b4c153e53590bfb0fa3080"} Oct 06 09:11:17 crc kubenswrapper[4755]: I1006 09:11:17.498763 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"21a73e92-20ea-4a05-8c43-e84b9e0bb15d","Type":"ContainerStarted","Data":"e19f0958cc7d469854afa55ccee533eef9d12b4cacd44026e0aa9b98206c32c2"} Oct 06 09:11:17 crc kubenswrapper[4755]: I1006 09:11:17.501539 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"d06bd886-edb5-4a47-9ad7-9290ec0945c1","Type":"ContainerStarted","Data":"9f4de28f12f48eeae9755ad80150d8f8cfeefcfefd9ecb4b5d9bea10104df9a4"} Oct 06 09:11:17 crc kubenswrapper[4755]: I1006 09:11:17.505388 4755 generic.go:334] "Generic (PLEG): container finished" podID="9fe5b7f4-6615-4f73-8116-51fcda3ef59e" containerID="2d9f7099f1804e5e6283e1b26eb890827c040fd0134d04aa3be68d21a7e6f02a" exitCode=0 Oct 06 09:11:17 crc kubenswrapper[4755]: I1006 09:11:17.505428 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" event={"ID":"9fe5b7f4-6615-4f73-8116-51fcda3ef59e","Type":"ContainerDied","Data":"2d9f7099f1804e5e6283e1b26eb890827c040fd0134d04aa3be68d21a7e6f02a"} Oct 06 09:11:17 crc kubenswrapper[4755]: I1006 09:11:17.505458 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" event={"ID":"9fe5b7f4-6615-4f73-8116-51fcda3ef59e","Type":"ContainerStarted","Data":"d58823820b9bad34e6932e0e8a5e3593989d4e7f28c5fd3cd53ba0d497e15484"} Oct 06 09:11:18 crc kubenswrapper[4755]: I1006 09:11:18.517838 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" event={"ID":"9fe5b7f4-6615-4f73-8116-51fcda3ef59e","Type":"ContainerStarted","Data":"89bd3e5d4a290418a5077ac7c54c82f2a05fb8064efdbb709c9d8837f77027aa"} Oct 06 09:11:18 crc kubenswrapper[4755]: I1006 09:11:18.520502 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:18 crc kubenswrapper[4755]: I1006 09:11:18.526757 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"004e482c-7cdb-4416-91d9-d0c43641625d","Type":"ContainerStarted","Data":"ab12b21a86beebceabbed399d042518eb1587cac5de069a48abca79b2a75b5e1"} Oct 06 09:11:18 crc kubenswrapper[4755]: I1006 09:11:18.526793 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"004e482c-7cdb-4416-91d9-d0c43641625d","Type":"ContainerStarted","Data":"fd8a6fc4c174ff4e8bab88005a022489b1c6421dde4c5ee907aa47883c67f4ef"} Oct 06 09:11:18 crc kubenswrapper[4755]: I1006 09:11:18.534224 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d06bd886-edb5-4a47-9ad7-9290ec0945c1","Type":"ContainerStarted","Data":"5721162b2785f89cee42a6d6bbf6f484432f6337ce69fb6930b7ce6b2db04a57"} Oct 06 09:11:18 crc kubenswrapper[4755]: I1006 09:11:18.534260 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d06bd886-edb5-4a47-9ad7-9290ec0945c1","Type":"ContainerStarted","Data":"963d7ab5f96906ca48e47b3684d8708bd4d499039920e9f84bb163057ce4112b"} Oct 06 09:11:18 crc kubenswrapper[4755]: I1006 09:11:18.535006 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 06 09:11:18 crc kubenswrapper[4755]: I1006 09:11:18.541573 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" podStartSLOduration=3.541543885 podStartE2EDuration="3.541543885s" podCreationTimestamp="2025-10-06 09:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:11:18.539447544 +0000 UTC m=+2935.368762748" watchObservedRunningTime="2025-10-06 09:11:18.541543885 +0000 UTC m=+2935.370859099" Oct 06 09:11:18 crc kubenswrapper[4755]: I1006 09:11:18.610687 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=2.610666801 podStartE2EDuration="2.610666801s" podCreationTimestamp="2025-10-06 09:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:11:18.55846813 +0000 UTC m=+2935.387783344" watchObservedRunningTime="2025-10-06 
09:11:18.610666801 +0000 UTC m=+2935.439982015" Oct 06 09:11:18 crc kubenswrapper[4755]: I1006 09:11:18.623095 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.9878920129999997 podStartE2EDuration="3.623071724s" podCreationTimestamp="2025-10-06 09:11:15 +0000 UTC" firstStartedPulling="2025-10-06 09:11:16.625219726 +0000 UTC m=+2933.454534940" lastFinishedPulling="2025-10-06 09:11:17.260399437 +0000 UTC m=+2934.089714651" observedRunningTime="2025-10-06 09:11:18.587266447 +0000 UTC m=+2935.416581671" watchObservedRunningTime="2025-10-06 09:11:18.623071724 +0000 UTC m=+2935.452386938" Oct 06 09:11:19 crc kubenswrapper[4755]: I1006 09:11:19.164375 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 06 09:11:20 crc kubenswrapper[4755]: I1006 09:11:20.559893 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="d06bd886-edb5-4a47-9ad7-9290ec0945c1" containerName="manila-api-log" containerID="cri-o://963d7ab5f96906ca48e47b3684d8708bd4d499039920e9f84bb163057ce4112b" gracePeriod=30 Oct 06 09:11:20 crc kubenswrapper[4755]: I1006 09:11:20.559995 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="d06bd886-edb5-4a47-9ad7-9290ec0945c1" containerName="manila-api" containerID="cri-o://5721162b2785f89cee42a6d6bbf6f484432f6337ce69fb6930b7ce6b2db04a57" gracePeriod=30 Oct 06 09:11:20 crc kubenswrapper[4755]: I1006 09:11:20.753869 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:11:20 crc kubenswrapper[4755]: I1006 09:11:20.754204 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="proxy-httpd" containerID="cri-o://e5e67c6b6382ae2fb0033e1b5d33da861fb9e8c4b2b853df46a06fe5171ae912" gracePeriod=30 
Oct 06 09:11:20 crc kubenswrapper[4755]: I1006 09:11:20.754270 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="sg-core" containerID="cri-o://4e38313bd3efd92a8f583d3a4edd56baf0d3c3f2ea4fb66d123230e6f4d045f8" gracePeriod=30 Oct 06 09:11:20 crc kubenswrapper[4755]: I1006 09:11:20.754315 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="ceilometer-notification-agent" containerID="cri-o://5d8da95c99958715be84162c8afef341e752202f8105a441edf06b75cabbac35" gracePeriod=30 Oct 06 09:11:20 crc kubenswrapper[4755]: I1006 09:11:20.754192 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="ceilometer-central-agent" containerID="cri-o://638f63dfe9467608b2ac9d625945426638e8ba88a8e26e39df89a6c21cccf998" gracePeriod=30 Oct 06 09:11:21 crc kubenswrapper[4755]: I1006 09:11:21.405537 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.442182 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.578550 4755 generic.go:334] "Generic (PLEG): container finished" podID="a6f52a11-abe0-44d2-b543-80fd120a6299" containerID="2e8ffd8aa0b49e6e6139380bc97256febb67914037cbf29d6a50ee589eefab8e" exitCode=137 Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.578615 4755 generic.go:334] "Generic (PLEG): container finished" podID="a6f52a11-abe0-44d2-b543-80fd120a6299" containerID="61411b67732887bcdc8d4e37646400bf3f91d466d5fa6ed492499050944c1cdc" exitCode=137 Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.578655 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58d8996c-r5v5j" event={"ID":"a6f52a11-abe0-44d2-b543-80fd120a6299","Type":"ContainerDied","Data":"2e8ffd8aa0b49e6e6139380bc97256febb67914037cbf29d6a50ee589eefab8e"} Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.578683 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58d8996c-r5v5j" event={"ID":"a6f52a11-abe0-44d2-b543-80fd120a6299","Type":"ContainerDied","Data":"61411b67732887bcdc8d4e37646400bf3f91d466d5fa6ed492499050944c1cdc"} Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.585590 4755 generic.go:334] "Generic (PLEG): container finished" podID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerID="e5e67c6b6382ae2fb0033e1b5d33da861fb9e8c4b2b853df46a06fe5171ae912" exitCode=0 Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.585613 4755 generic.go:334] "Generic (PLEG): container finished" podID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerID="4e38313bd3efd92a8f583d3a4edd56baf0d3c3f2ea4fb66d123230e6f4d045f8" exitCode=2 Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.585622 4755 generic.go:334] "Generic (PLEG): container finished" podID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerID="638f63dfe9467608b2ac9d625945426638e8ba88a8e26e39df89a6c21cccf998" exitCode=0 Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.585661 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae6bbc1-632c-4769-9fcf-b7689df07c49","Type":"ContainerDied","Data":"e5e67c6b6382ae2fb0033e1b5d33da861fb9e8c4b2b853df46a06fe5171ae912"} Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.585718 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae6bbc1-632c-4769-9fcf-b7689df07c49","Type":"ContainerDied","Data":"4e38313bd3efd92a8f583d3a4edd56baf0d3c3f2ea4fb66d123230e6f4d045f8"} Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.585731 4755 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae6bbc1-632c-4769-9fcf-b7689df07c49","Type":"ContainerDied","Data":"638f63dfe9467608b2ac9d625945426638e8ba88a8e26e39df89a6c21cccf998"} Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.588581 4755 generic.go:334] "Generic (PLEG): container finished" podID="2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" containerID="111525c13ac7319566438bced69942dca4a3e4f4a6315117cb43762a400563ea" exitCode=137 Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.588596 4755 generic.go:334] "Generic (PLEG): container finished" podID="2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" containerID="5d6b234790e12896944731749531e5d266893f4eb87778ca74fa9ee384174e12" exitCode=137 Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.588637 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85c75dc44f-w5tzr" event={"ID":"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b","Type":"ContainerDied","Data":"111525c13ac7319566438bced69942dca4a3e4f4a6315117cb43762a400563ea"} Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.588679 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85c75dc44f-w5tzr" event={"ID":"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b","Type":"ContainerDied","Data":"5d6b234790e12896944731749531e5d266893f4eb87778ca74fa9ee384174e12"} Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.592731 4755 generic.go:334] "Generic (PLEG): container finished" podID="d06bd886-edb5-4a47-9ad7-9290ec0945c1" containerID="5721162b2785f89cee42a6d6bbf6f484432f6337ce69fb6930b7ce6b2db04a57" exitCode=0 Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.592746 4755 generic.go:334] "Generic (PLEG): container finished" podID="d06bd886-edb5-4a47-9ad7-9290ec0945c1" containerID="963d7ab5f96906ca48e47b3684d8708bd4d499039920e9f84bb163057ce4112b" exitCode=143 Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.592783 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"d06bd886-edb5-4a47-9ad7-9290ec0945c1","Type":"ContainerDied","Data":"5721162b2785f89cee42a6d6bbf6f484432f6337ce69fb6930b7ce6b2db04a57"} Oct 06 09:11:22 crc kubenswrapper[4755]: I1006 09:11:21.592855 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d06bd886-edb5-4a47-9ad7-9290ec0945c1","Type":"ContainerDied","Data":"963d7ab5f96906ca48e47b3684d8708bd4d499039920e9f84bb163057ce4112b"} Oct 06 09:11:23 crc kubenswrapper[4755]: I1006 09:11:23.225105 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:11:23 crc kubenswrapper[4755]: I1006 09:11:23.352239 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5797d74dbd-6v4nj" Oct 06 09:11:23 crc kubenswrapper[4755]: I1006 09:11:23.413084 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77dcf6c7d-x7gnh"] Oct 06 09:11:23 crc kubenswrapper[4755]: I1006 09:11:23.610849 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77dcf6c7d-x7gnh" podUID="44e61052-105b-4bd0-8056-8a29dec9fcfe" containerName="horizon-log" containerID="cri-o://75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb" gracePeriod=30 Oct 06 09:11:23 crc kubenswrapper[4755]: I1006 09:11:23.611073 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77dcf6c7d-x7gnh" podUID="44e61052-105b-4bd0-8056-8a29dec9fcfe" containerName="horizon" containerID="cri-o://389df74a08b498ecdb20ff74a17163729f48bb649e4cea0823c4718993d15c73" gracePeriod=30 Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.619233 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.628462 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.647494 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d06bd886-edb5-4a47-9ad7-9290ec0945c1","Type":"ContainerDied","Data":"9f4de28f12f48eeae9755ad80150d8f8cfeefcfefd9ecb4b5d9bea10104df9a4"} Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.647551 4755 scope.go:117] "RemoveContainer" containerID="5721162b2785f89cee42a6d6bbf6f484432f6337ce69fb6930b7ce6b2db04a57" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.647780 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.652263 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.654964 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58d8996c-r5v5j" event={"ID":"a6f52a11-abe0-44d2-b543-80fd120a6299","Type":"ContainerDied","Data":"f6b9641bb8f0ea06103925627624d904a3504a2310b201e3b70794d7120649c6"} Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.669514 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85c75dc44f-w5tzr" event={"ID":"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b","Type":"ContainerDied","Data":"bc56fafe4d725aa171975511cb0de6360f5b52bfa0aca109a5154e4bec800a9a"} Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.669613 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85c75dc44f-w5tzr" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.696757 4755 scope.go:117] "RemoveContainer" containerID="963d7ab5f96906ca48e47b3684d8708bd4d499039920e9f84bb163057ce4112b" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718322 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-combined-ca-bundle\") pod \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718377 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06bd886-edb5-4a47-9ad7-9290ec0945c1-logs\") pod \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718449 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8gcl\" (UniqueName: \"kubernetes.io/projected/d06bd886-edb5-4a47-9ad7-9290ec0945c1-kube-api-access-x8gcl\") pod \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718493 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-config-data-custom\") pod \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718616 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smn7b\" (UniqueName: \"kubernetes.io/projected/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-kube-api-access-smn7b\") pod \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\" (UID: 
\"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718634 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-scripts\") pod \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718663 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-scripts\") pod \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718694 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-config-data\") pod \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718712 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-logs\") pod \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718728 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06bd886-edb5-4a47-9ad7-9290ec0945c1-etc-machine-id\") pod \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\" (UID: \"d06bd886-edb5-4a47-9ad7-9290ec0945c1\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718834 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-horizon-secret-key\") pod \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718854 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-config-data\") pod \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\" (UID: \"2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718883 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06bd886-edb5-4a47-9ad7-9290ec0945c1-logs" (OuterVolumeSpecName: "logs") pod "d06bd886-edb5-4a47-9ad7-9290ec0945c1" (UID: "d06bd886-edb5-4a47-9ad7-9290ec0945c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.718941 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d06bd886-edb5-4a47-9ad7-9290ec0945c1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d06bd886-edb5-4a47-9ad7-9290ec0945c1" (UID: "d06bd886-edb5-4a47-9ad7-9290ec0945c1"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.719351 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06bd886-edb5-4a47-9ad7-9290ec0945c1-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.719369 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06bd886-edb5-4a47-9ad7-9290ec0945c1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.719713 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-logs" (OuterVolumeSpecName: "logs") pod "2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" (UID: "2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.721554 4755 scope.go:117] "RemoveContainer" containerID="2e8ffd8aa0b49e6e6139380bc97256febb67914037cbf29d6a50ee589eefab8e" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.724814 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06bd886-edb5-4a47-9ad7-9290ec0945c1-kube-api-access-x8gcl" (OuterVolumeSpecName: "kube-api-access-x8gcl") pod "d06bd886-edb5-4a47-9ad7-9290ec0945c1" (UID: "d06bd886-edb5-4a47-9ad7-9290ec0945c1"). InnerVolumeSpecName "kube-api-access-x8gcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.726152 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d06bd886-edb5-4a47-9ad7-9290ec0945c1" (UID: "d06bd886-edb5-4a47-9ad7-9290ec0945c1"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.726282 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-scripts" (OuterVolumeSpecName: "scripts") pod "d06bd886-edb5-4a47-9ad7-9290ec0945c1" (UID: "d06bd886-edb5-4a47-9ad7-9290ec0945c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.728668 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-kube-api-access-smn7b" (OuterVolumeSpecName: "kube-api-access-smn7b") pod "2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" (UID: "2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b"). InnerVolumeSpecName "kube-api-access-smn7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.728668 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" (UID: "2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.754949 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-scripts" (OuterVolumeSpecName: "scripts") pod "2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" (UID: "2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.759613 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d06bd886-edb5-4a47-9ad7-9290ec0945c1" (UID: "d06bd886-edb5-4a47-9ad7-9290ec0945c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.765275 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-config-data" (OuterVolumeSpecName: "config-data") pod "2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" (UID: "2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.785030 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-config-data" (OuterVolumeSpecName: "config-data") pod "d06bd886-edb5-4a47-9ad7-9290ec0945c1" (UID: "d06bd886-edb5-4a47-9ad7-9290ec0945c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.820848 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6f52a11-abe0-44d2-b543-80fd120a6299-scripts\") pod \"a6f52a11-abe0-44d2-b543-80fd120a6299\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.820995 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f52a11-abe0-44d2-b543-80fd120a6299-logs\") pod \"a6f52a11-abe0-44d2-b543-80fd120a6299\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.821101 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6f52a11-abe0-44d2-b543-80fd120a6299-config-data\") pod \"a6f52a11-abe0-44d2-b543-80fd120a6299\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.821133 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzbh2\" (UniqueName: \"kubernetes.io/projected/a6f52a11-abe0-44d2-b543-80fd120a6299-kube-api-access-qzbh2\") pod \"a6f52a11-abe0-44d2-b543-80fd120a6299\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.821202 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a6f52a11-abe0-44d2-b543-80fd120a6299-horizon-secret-key\") pod \"a6f52a11-abe0-44d2-b543-80fd120a6299\" (UID: \"a6f52a11-abe0-44d2-b543-80fd120a6299\") " Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.821792 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smn7b\" (UniqueName: 
\"kubernetes.io/projected/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-kube-api-access-smn7b\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.821817 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.821831 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.821844 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.821854 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.821865 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.821877 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.821890 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 
09:11:24.821903 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8gcl\" (UniqueName: \"kubernetes.io/projected/d06bd886-edb5-4a47-9ad7-9290ec0945c1-kube-api-access-x8gcl\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.821914 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d06bd886-edb5-4a47-9ad7-9290ec0945c1-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.822955 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6f52a11-abe0-44d2-b543-80fd120a6299-logs" (OuterVolumeSpecName: "logs") pod "a6f52a11-abe0-44d2-b543-80fd120a6299" (UID: "a6f52a11-abe0-44d2-b543-80fd120a6299"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.825091 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6f52a11-abe0-44d2-b543-80fd120a6299-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a6f52a11-abe0-44d2-b543-80fd120a6299" (UID: "a6f52a11-abe0-44d2-b543-80fd120a6299"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.825091 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f52a11-abe0-44d2-b543-80fd120a6299-kube-api-access-qzbh2" (OuterVolumeSpecName: "kube-api-access-qzbh2") pod "a6f52a11-abe0-44d2-b543-80fd120a6299" (UID: "a6f52a11-abe0-44d2-b543-80fd120a6299"). InnerVolumeSpecName "kube-api-access-qzbh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.844492 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f52a11-abe0-44d2-b543-80fd120a6299-scripts" (OuterVolumeSpecName: "scripts") pod "a6f52a11-abe0-44d2-b543-80fd120a6299" (UID: "a6f52a11-abe0-44d2-b543-80fd120a6299"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.849790 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f52a11-abe0-44d2-b543-80fd120a6299-config-data" (OuterVolumeSpecName: "config-data") pod "a6f52a11-abe0-44d2-b543-80fd120a6299" (UID: "a6f52a11-abe0-44d2-b543-80fd120a6299"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.902243 4755 scope.go:117] "RemoveContainer" containerID="61411b67732887bcdc8d4e37646400bf3f91d466d5fa6ed492499050944c1cdc" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.924536 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6f52a11-abe0-44d2-b543-80fd120a6299-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.924587 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f52a11-abe0-44d2-b543-80fd120a6299-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.924601 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6f52a11-abe0-44d2-b543-80fd120a6299-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.924615 4755 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qzbh2\" (UniqueName: \"kubernetes.io/projected/a6f52a11-abe0-44d2-b543-80fd120a6299-kube-api-access-qzbh2\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.924629 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a6f52a11-abe0-44d2-b543-80fd120a6299-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.924767 4755 scope.go:117] "RemoveContainer" containerID="111525c13ac7319566438bced69942dca4a3e4f4a6315117cb43762a400563ea" Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.986592 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 06 09:11:24 crc kubenswrapper[4755]: I1006 09:11:24.998893 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.106770 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 06 09:11:25 crc kubenswrapper[4755]: E1006 09:11:25.109287 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06bd886-edb5-4a47-9ad7-9290ec0945c1" containerName="manila-api-log" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.109317 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06bd886-edb5-4a47-9ad7-9290ec0945c1" containerName="manila-api-log" Oct 06 09:11:25 crc kubenswrapper[4755]: E1006 09:11:25.109356 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" containerName="horizon" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.109390 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" containerName="horizon" Oct 06 09:11:25 crc kubenswrapper[4755]: E1006 09:11:25.109429 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d06bd886-edb5-4a47-9ad7-9290ec0945c1" containerName="manila-api" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.109438 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06bd886-edb5-4a47-9ad7-9290ec0945c1" containerName="manila-api" Oct 06 09:11:25 crc kubenswrapper[4755]: E1006 09:11:25.109474 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" containerName="horizon-log" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.109482 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" containerName="horizon-log" Oct 06 09:11:25 crc kubenswrapper[4755]: E1006 09:11:25.109513 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f52a11-abe0-44d2-b543-80fd120a6299" containerName="horizon-log" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.109521 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f52a11-abe0-44d2-b543-80fd120a6299" containerName="horizon-log" Oct 06 09:11:25 crc kubenswrapper[4755]: E1006 09:11:25.109662 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f52a11-abe0-44d2-b543-80fd120a6299" containerName="horizon" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.109686 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f52a11-abe0-44d2-b543-80fd120a6299" containerName="horizon" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.110785 4755 scope.go:117] "RemoveContainer" containerID="5d6b234790e12896944731749531e5d266893f4eb87778ca74fa9ee384174e12" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.111065 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" containerName="horizon-log" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.111377 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f52a11-abe0-44d2-b543-80fd120a6299" containerName="horizon-log" Oct 
06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.111414 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f52a11-abe0-44d2-b543-80fd120a6299" containerName="horizon" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.111456 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06bd886-edb5-4a47-9ad7-9290ec0945c1" containerName="manila-api-log" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.111475 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" containerName="horizon" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.111499 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06bd886-edb5-4a47-9ad7-9290ec0945c1" containerName="manila-api" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.114078 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.115285 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.116289 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.117248 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.122658 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.134278 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85c75dc44f-w5tzr"] Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.148166 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-85c75dc44f-w5tzr"] Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.236363 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.236746 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-scripts\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.236814 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1660df-89ac-403d-8343-195b26f04e5e-etc-machine-id\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.236843 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c1660df-89ac-403d-8343-195b26f04e5e-logs\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.236869 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-config-data-custom\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.236900 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8zts\" (UniqueName: 
\"kubernetes.io/projected/2c1660df-89ac-403d-8343-195b26f04e5e-kube-api-access-z8zts\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.236920 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-config-data\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.236934 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-public-tls-certs\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.236974 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.338785 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-config-data-custom\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.338841 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8zts\" (UniqueName: \"kubernetes.io/projected/2c1660df-89ac-403d-8343-195b26f04e5e-kube-api-access-z8zts\") pod \"manila-api-0\" (UID: 
\"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.338886 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-config-data\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.338902 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-public-tls-certs\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.338944 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.338981 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.339026 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-scripts\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.339080 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1660df-89ac-403d-8343-195b26f04e5e-etc-machine-id\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.339105 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c1660df-89ac-403d-8343-195b26f04e5e-logs\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.339462 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c1660df-89ac-403d-8343-195b26f04e5e-logs\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.339505 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1660df-89ac-403d-8343-195b26f04e5e-etc-machine-id\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.343372 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-config-data\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.343908 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.344335 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-public-tls-certs\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.346908 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-scripts\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.347066 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-config-data-custom\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.357116 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8zts\" (UniqueName: \"kubernetes.io/projected/2c1660df-89ac-403d-8343-195b26f04e5e-kube-api-access-z8zts\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.357649 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1660df-89ac-403d-8343-195b26f04e5e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"2c1660df-89ac-403d-8343-195b26f04e5e\") " pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.437744 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.682215 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58d8996c-r5v5j" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.687207 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"21a73e92-20ea-4a05-8c43-e84b9e0bb15d","Type":"ContainerStarted","Data":"8e4e83da443ff2c36ddd03f7f31188971758ad9c2c0f4c3baeac5da72e2d6852"} Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.687249 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"21a73e92-20ea-4a05-8c43-e84b9e0bb15d","Type":"ContainerStarted","Data":"e1bc7238c6c6c5d875957d93de50e1f0ab2679a64041939c13dd4208649984ba"} Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.720776 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.098953938 podStartE2EDuration="10.720715049s" podCreationTimestamp="2025-10-06 09:11:15 +0000 UTC" firstStartedPulling="2025-10-06 09:11:16.881742458 +0000 UTC m=+2933.711057692" lastFinishedPulling="2025-10-06 09:11:24.503503589 +0000 UTC m=+2941.332818803" observedRunningTime="2025-10-06 09:11:25.708688683 +0000 UTC m=+2942.538003917" watchObservedRunningTime="2025-10-06 09:11:25.720715049 +0000 UTC m=+2942.550030263" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.731727 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58d8996c-r5v5j"] Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.738826 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-58d8996c-r5v5j"] Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.893962 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b" path="/var/lib/kubelet/pods/2ffa4bf1-4928-4e5c-acf2-1ff7ffaec08b/volumes" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.895320 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a6f52a11-abe0-44d2-b543-80fd120a6299" path="/var/lib/kubelet/pods/a6f52a11-abe0-44d2-b543-80fd120a6299/volumes" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.896712 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06bd886-edb5-4a47-9ad7-9290ec0945c1" path="/var/lib/kubelet/pods/d06bd886-edb5-4a47-9ad7-9290ec0945c1/volumes" Oct 06 09:11:25 crc kubenswrapper[4755]: I1006 09:11:25.994268 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.106381 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.264514 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.281005 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b5fdb995-nl7nk" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.355093 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-tc5cr"] Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.355373 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" podUID="5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" containerName="dnsmasq-dns" containerID="cri-o://bde12b463f3c2378a7a470cc022aa6a6180144213d828e3cd662be9efded7378" gracePeriod=10 Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.607196 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.701827 4755 generic.go:334] "Generic (PLEG): container finished" podID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerID="5d8da95c99958715be84162c8afef341e752202f8105a441edf06b75cabbac35" exitCode=0 Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.701925 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae6bbc1-632c-4769-9fcf-b7689df07c49","Type":"ContainerDied","Data":"5d8da95c99958715be84162c8afef341e752202f8105a441edf06b75cabbac35"} Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.701984 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae6bbc1-632c-4769-9fcf-b7689df07c49","Type":"ContainerDied","Data":"c45253b3ae45fd1897679d748a79c3679c864244dbe43e86ce67d7c4782464a9"} Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.702007 4755 scope.go:117] "RemoveContainer" containerID="e5e67c6b6382ae2fb0033e1b5d33da861fb9e8c4b2b853df46a06fe5171ae912" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.702212 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.712278 4755 generic.go:334] "Generic (PLEG): container finished" podID="5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" containerID="bde12b463f3c2378a7a470cc022aa6a6180144213d828e3cd662be9efded7378" exitCode=0 Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.712340 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" event={"ID":"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24","Type":"ContainerDied","Data":"bde12b463f3c2378a7a470cc022aa6a6180144213d828e3cd662be9efded7378"} Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.722401 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"2c1660df-89ac-403d-8343-195b26f04e5e","Type":"ContainerStarted","Data":"9f0a6a132f5b9a6a40fd05e46f70a5792ea33df434553b4c025d0fc54848cb2f"} Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.722433 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"2c1660df-89ac-403d-8343-195b26f04e5e","Type":"ContainerStarted","Data":"7c06018a3754e3dbbc83515c9399f91a9f3f1e5116d953dc1b8ca0feb0d19b35"} Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.772792 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86dfd\" (UniqueName: \"kubernetes.io/projected/1ae6bbc1-632c-4769-9fcf-b7689df07c49-kube-api-access-86dfd\") pod \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.772854 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae6bbc1-632c-4769-9fcf-b7689df07c49-run-httpd\") pod \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.772886 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-scripts\") pod \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.772914 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-combined-ca-bundle\") pod \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.772979 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-config-data\") pod \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.773110 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae6bbc1-632c-4769-9fcf-b7689df07c49-log-httpd\") pod \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.773233 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-ceilometer-tls-certs\") pod \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\" (UID: \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.773275 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-sg-core-conf-yaml\") pod \"1ae6bbc1-632c-4769-9fcf-b7689df07c49\" (UID: 
\"1ae6bbc1-632c-4769-9fcf-b7689df07c49\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.777227 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ae6bbc1-632c-4769-9fcf-b7689df07c49-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1ae6bbc1-632c-4769-9fcf-b7689df07c49" (UID: "1ae6bbc1-632c-4769-9fcf-b7689df07c49"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.780222 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ae6bbc1-632c-4769-9fcf-b7689df07c49-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1ae6bbc1-632c-4769-9fcf-b7689df07c49" (UID: "1ae6bbc1-632c-4769-9fcf-b7689df07c49"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.783791 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ae6bbc1-632c-4769-9fcf-b7689df07c49-kube-api-access-86dfd" (OuterVolumeSpecName: "kube-api-access-86dfd") pod "1ae6bbc1-632c-4769-9fcf-b7689df07c49" (UID: "1ae6bbc1-632c-4769-9fcf-b7689df07c49"). InnerVolumeSpecName "kube-api-access-86dfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.790593 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-scripts" (OuterVolumeSpecName: "scripts") pod "1ae6bbc1-632c-4769-9fcf-b7689df07c49" (UID: "1ae6bbc1-632c-4769-9fcf-b7689df07c49"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.822904 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1ae6bbc1-632c-4769-9fcf-b7689df07c49" (UID: "1ae6bbc1-632c-4769-9fcf-b7689df07c49"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.876159 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.876187 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86dfd\" (UniqueName: \"kubernetes.io/projected/1ae6bbc1-632c-4769-9fcf-b7689df07c49-kube-api-access-86dfd\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.876201 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae6bbc1-632c-4769-9fcf-b7689df07c49-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.876209 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.876220 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae6bbc1-632c-4769-9fcf-b7689df07c49-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.900605 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.901409 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ae6bbc1-632c-4769-9fcf-b7689df07c49" (UID: "1ae6bbc1-632c-4769-9fcf-b7689df07c49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.911730 4755 scope.go:117] "RemoveContainer" containerID="4e38313bd3efd92a8f583d3a4edd56baf0d3c3f2ea4fb66d123230e6f4d045f8" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.916046 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1ae6bbc1-632c-4769-9fcf-b7689df07c49" (UID: "1ae6bbc1-632c-4769-9fcf-b7689df07c49"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.942637 4755 scope.go:117] "RemoveContainer" containerID="5d8da95c99958715be84162c8afef341e752202f8105a441edf06b75cabbac35" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.974045 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-config-data" (OuterVolumeSpecName: "config-data") pod "1ae6bbc1-632c-4769-9fcf-b7689df07c49" (UID: "1ae6bbc1-632c-4769-9fcf-b7689df07c49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.978029 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-dns-svc\") pod \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.978094 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-ovsdbserver-nb\") pod \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.978124 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkv88\" (UniqueName: \"kubernetes.io/projected/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-kube-api-access-gkv88\") pod \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.978147 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-openstack-edpm-ipam\") pod \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.978181 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-ovsdbserver-sb\") pod \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.978379 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-config\") pod \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\" (UID: \"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24\") " Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.978876 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.978898 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.978910 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae6bbc1-632c-4769-9fcf-b7689df07c49-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:26 crc kubenswrapper[4755]: I1006 09:11:26.990634 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-kube-api-access-gkv88" (OuterVolumeSpecName: "kube-api-access-gkv88") pod "5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" (UID: "5e4b100e-d1f9-4bed-a11a-a6d3d593cc24"). InnerVolumeSpecName "kube-api-access-gkv88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.010949 4755 scope.go:117] "RemoveContainer" containerID="638f63dfe9467608b2ac9d625945426638e8ba88a8e26e39df89a6c21cccf998" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.038948 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" (UID: "5e4b100e-d1f9-4bed-a11a-a6d3d593cc24"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.069211 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" (UID: "5e4b100e-d1f9-4bed-a11a-a6d3d593cc24"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.082391 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" (UID: "5e4b100e-d1f9-4bed-a11a-a6d3d593cc24"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.085193 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.085227 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkv88\" (UniqueName: \"kubernetes.io/projected/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-kube-api-access-gkv88\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.085238 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.085249 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.097304 4755 scope.go:117] "RemoveContainer" containerID="e5e67c6b6382ae2fb0033e1b5d33da861fb9e8c4b2b853df46a06fe5171ae912" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.101550 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.102144 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-config" (OuterVolumeSpecName: "config") pod "5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" (UID: "5e4b100e-d1f9-4bed-a11a-a6d3d593cc24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.111082 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" (UID: "5e4b100e-d1f9-4bed-a11a-a6d3d593cc24"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:11:27 crc kubenswrapper[4755]: E1006 09:11:27.113952 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5e67c6b6382ae2fb0033e1b5d33da861fb9e8c4b2b853df46a06fe5171ae912\": container with ID starting with e5e67c6b6382ae2fb0033e1b5d33da861fb9e8c4b2b853df46a06fe5171ae912 not found: ID does not exist" containerID="e5e67c6b6382ae2fb0033e1b5d33da861fb9e8c4b2b853df46a06fe5171ae912" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.113992 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5e67c6b6382ae2fb0033e1b5d33da861fb9e8c4b2b853df46a06fe5171ae912"} err="failed to get container status \"e5e67c6b6382ae2fb0033e1b5d33da861fb9e8c4b2b853df46a06fe5171ae912\": rpc error: code = NotFound desc = could not find container \"e5e67c6b6382ae2fb0033e1b5d33da861fb9e8c4b2b853df46a06fe5171ae912\": container with ID starting with e5e67c6b6382ae2fb0033e1b5d33da861fb9e8c4b2b853df46a06fe5171ae912 not found: ID does not exist" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.114016 4755 scope.go:117] "RemoveContainer" containerID="4e38313bd3efd92a8f583d3a4edd56baf0d3c3f2ea4fb66d123230e6f4d045f8" Oct 06 09:11:27 crc kubenswrapper[4755]: E1006 09:11:27.119797 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e38313bd3efd92a8f583d3a4edd56baf0d3c3f2ea4fb66d123230e6f4d045f8\": container with ID starting with 4e38313bd3efd92a8f583d3a4edd56baf0d3c3f2ea4fb66d123230e6f4d045f8 not found: ID does not exist" containerID="4e38313bd3efd92a8f583d3a4edd56baf0d3c3f2ea4fb66d123230e6f4d045f8" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.120047 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e38313bd3efd92a8f583d3a4edd56baf0d3c3f2ea4fb66d123230e6f4d045f8"} 
err="failed to get container status \"4e38313bd3efd92a8f583d3a4edd56baf0d3c3f2ea4fb66d123230e6f4d045f8\": rpc error: code = NotFound desc = could not find container \"4e38313bd3efd92a8f583d3a4edd56baf0d3c3f2ea4fb66d123230e6f4d045f8\": container with ID starting with 4e38313bd3efd92a8f583d3a4edd56baf0d3c3f2ea4fb66d123230e6f4d045f8 not found: ID does not exist" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.120170 4755 scope.go:117] "RemoveContainer" containerID="5d8da95c99958715be84162c8afef341e752202f8105a441edf06b75cabbac35" Oct 06 09:11:27 crc kubenswrapper[4755]: E1006 09:11:27.122765 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d8da95c99958715be84162c8afef341e752202f8105a441edf06b75cabbac35\": container with ID starting with 5d8da95c99958715be84162c8afef341e752202f8105a441edf06b75cabbac35 not found: ID does not exist" containerID="5d8da95c99958715be84162c8afef341e752202f8105a441edf06b75cabbac35" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.122827 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d8da95c99958715be84162c8afef341e752202f8105a441edf06b75cabbac35"} err="failed to get container status \"5d8da95c99958715be84162c8afef341e752202f8105a441edf06b75cabbac35\": rpc error: code = NotFound desc = could not find container \"5d8da95c99958715be84162c8afef341e752202f8105a441edf06b75cabbac35\": container with ID starting with 5d8da95c99958715be84162c8afef341e752202f8105a441edf06b75cabbac35 not found: ID does not exist" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.122854 4755 scope.go:117] "RemoveContainer" containerID="638f63dfe9467608b2ac9d625945426638e8ba88a8e26e39df89a6c21cccf998" Oct 06 09:11:27 crc kubenswrapper[4755]: E1006 09:11:27.123823 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"638f63dfe9467608b2ac9d625945426638e8ba88a8e26e39df89a6c21cccf998\": container with ID starting with 638f63dfe9467608b2ac9d625945426638e8ba88a8e26e39df89a6c21cccf998 not found: ID does not exist" containerID="638f63dfe9467608b2ac9d625945426638e8ba88a8e26e39df89a6c21cccf998" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.123871 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638f63dfe9467608b2ac9d625945426638e8ba88a8e26e39df89a6c21cccf998"} err="failed to get container status \"638f63dfe9467608b2ac9d625945426638e8ba88a8e26e39df89a6c21cccf998\": rpc error: code = NotFound desc = could not find container \"638f63dfe9467608b2ac9d625945426638e8ba88a8e26e39df89a6c21cccf998\": container with ID starting with 638f63dfe9467608b2ac9d625945426638e8ba88a8e26e39df89a6c21cccf998 not found: ID does not exist" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.126463 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.140355 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:11:27 crc kubenswrapper[4755]: E1006 09:11:27.142866 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="proxy-httpd" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.142894 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="proxy-httpd" Oct 06 09:11:27 crc kubenswrapper[4755]: E1006 09:11:27.142946 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" containerName="init" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.142960 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" containerName="init" Oct 06 09:11:27 crc kubenswrapper[4755]: E1006 09:11:27.143237 4755 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="ceilometer-central-agent" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.143257 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="ceilometer-central-agent" Oct 06 09:11:27 crc kubenswrapper[4755]: E1006 09:11:27.143268 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="ceilometer-notification-agent" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.143276 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="ceilometer-notification-agent" Oct 06 09:11:27 crc kubenswrapper[4755]: E1006 09:11:27.143292 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" containerName="dnsmasq-dns" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.143300 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" containerName="dnsmasq-dns" Oct 06 09:11:27 crc kubenswrapper[4755]: E1006 09:11:27.143318 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="sg-core" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.143325 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="sg-core" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.146742 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="ceilometer-notification-agent" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.146766 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="sg-core" Oct 06 09:11:27 crc kubenswrapper[4755]: 
I1006 09:11:27.146784 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="proxy-httpd" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.146811 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" containerName="dnsmasq-dns" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.146825 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" containerName="ceilometer-central-agent" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.148965 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.149330 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.151270 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.151491 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.155148 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.187350 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.187683 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.289336 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d3a88e1-4008-4374-bc89-f94cf5507b29-log-httpd\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.289542 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-config-data\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.289640 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.289768 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.289885 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.290008 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpfzg\" (UniqueName: 
\"kubernetes.io/projected/7d3a88e1-4008-4374-bc89-f94cf5507b29-kube-api-access-dpfzg\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.290154 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-scripts\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.290321 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d3a88e1-4008-4374-bc89-f94cf5507b29-run-httpd\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.391723 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d3a88e1-4008-4374-bc89-f94cf5507b29-log-httpd\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.391779 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-config-data\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.391812 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc 
kubenswrapper[4755]: I1006 09:11:27.391838 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.391880 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.391916 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpfzg\" (UniqueName: \"kubernetes.io/projected/7d3a88e1-4008-4374-bc89-f94cf5507b29-kube-api-access-dpfzg\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.391988 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-scripts\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.392035 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d3a88e1-4008-4374-bc89-f94cf5507b29-run-httpd\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.392860 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7d3a88e1-4008-4374-bc89-f94cf5507b29-run-httpd\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.393052 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d3a88e1-4008-4374-bc89-f94cf5507b29-log-httpd\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.398211 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.398298 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.399145 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-scripts\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.400178 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-config-data\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.403489 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.411828 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpfzg\" (UniqueName: \"kubernetes.io/projected/7d3a88e1-4008-4374-bc89-f94cf5507b29-kube-api-access-dpfzg\") pod \"ceilometer-0\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.554943 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.736130 4755 generic.go:334] "Generic (PLEG): container finished" podID="44e61052-105b-4bd0-8056-8a29dec9fcfe" containerID="389df74a08b498ecdb20ff74a17163729f48bb649e4cea0823c4718993d15c73" exitCode=0 Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.736219 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77dcf6c7d-x7gnh" event={"ID":"44e61052-105b-4bd0-8056-8a29dec9fcfe","Type":"ContainerDied","Data":"389df74a08b498ecdb20ff74a17163729f48bb649e4cea0823c4718993d15c73"} Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.765120 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" event={"ID":"5e4b100e-d1f9-4bed-a11a-a6d3d593cc24","Type":"ContainerDied","Data":"5cfffcc8f3c81e05c522743b8635eacd659610417002ca39efbef8aefb666b5a"} Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.765161 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-tc5cr" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.765197 4755 scope.go:117] "RemoveContainer" containerID="bde12b463f3c2378a7a470cc022aa6a6180144213d828e3cd662be9efded7378" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.781203 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"2c1660df-89ac-403d-8343-195b26f04e5e","Type":"ContainerStarted","Data":"76343bd0873fa6317c21f66b4834fcf71a92972b1066f847df1b9941d27f68b2"} Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.782802 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.816137 4755 scope.go:117] "RemoveContainer" containerID="17e664427d57ea6f7d5fc4353c0b2d2a9b62c0a16f4031276879d9e6318cfe93" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.826391 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.826370863 podStartE2EDuration="3.826370863s" podCreationTimestamp="2025-10-06 09:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:11:27.807538851 +0000 UTC m=+2944.636854075" watchObservedRunningTime="2025-10-06 09:11:27.826370863 +0000 UTC m=+2944.655686067" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.848942 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-tc5cr"] Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.856996 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-tc5cr"] Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.889589 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ae6bbc1-632c-4769-9fcf-b7689df07c49" 
path="/var/lib/kubelet/pods/1ae6bbc1-632c-4769-9fcf-b7689df07c49/volumes" Oct 06 09:11:27 crc kubenswrapper[4755]: I1006 09:11:27.890332 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e4b100e-d1f9-4bed-a11a-a6d3d593cc24" path="/var/lib/kubelet/pods/5e4b100e-d1f9-4bed-a11a-a6d3d593cc24/volumes" Oct 06 09:11:28 crc kubenswrapper[4755]: I1006 09:11:28.041277 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:11:28 crc kubenswrapper[4755]: E1006 09:11:28.662239 4755 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/df8b2002a0451739b7d0ff94945078d75fd1f309a9268116e733cc34914c3d20/diff" to get inode usage: stat /var/lib/containers/storage/overlay/df8b2002a0451739b7d0ff94945078d75fd1f309a9268116e733cc34914c3d20/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_ceilometer-0_1ae6bbc1-632c-4769-9fcf-b7689df07c49/ceilometer-notification-agent/0.log" to get inode usage: stat /var/log/pods/openstack_ceilometer-0_1ae6bbc1-632c-4769-9fcf-b7689df07c49/ceilometer-notification-agent/0.log: no such file or directory Oct 06 09:11:28 crc kubenswrapper[4755]: I1006 09:11:28.780924 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d3a88e1-4008-4374-bc89-f94cf5507b29","Type":"ContainerStarted","Data":"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e"} Oct 06 09:11:28 crc kubenswrapper[4755]: I1006 09:11:28.781255 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d3a88e1-4008-4374-bc89-f94cf5507b29","Type":"ContainerStarted","Data":"f96e4062e1d4918c03204703f0f6458d834c13afab41b12b78bbd9c0d73db44d"} Oct 06 09:11:28 crc kubenswrapper[4755]: I1006 09:11:28.828171 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:11:28 crc kubenswrapper[4755]: I1006 
09:11:28.929281 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-77dcf6c7d-x7gnh" podUID="44e61052-105b-4bd0-8056-8a29dec9fcfe" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.245:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.245:8443: connect: connection refused" Oct 06 09:11:29 crc kubenswrapper[4755]: I1006 09:11:29.793251 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d3a88e1-4008-4374-bc89-f94cf5507b29","Type":"ContainerStarted","Data":"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192"} Oct 06 09:11:30 crc kubenswrapper[4755]: I1006 09:11:30.803157 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d3a88e1-4008-4374-bc89-f94cf5507b29","Type":"ContainerStarted","Data":"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987"} Oct 06 09:11:33 crc kubenswrapper[4755]: I1006 09:11:33.832860 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d3a88e1-4008-4374-bc89-f94cf5507b29","Type":"ContainerStarted","Data":"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca"} Oct 06 09:11:33 crc kubenswrapper[4755]: I1006 09:11:33.833622 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 09:11:33 crc kubenswrapper[4755]: I1006 09:11:33.833377 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="proxy-httpd" containerID="cri-o://86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca" gracePeriod=30 Oct 06 09:11:33 crc kubenswrapper[4755]: I1006 09:11:33.833100 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="ceilometer-central-agent" 
containerID="cri-o://fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e" gracePeriod=30 Oct 06 09:11:33 crc kubenswrapper[4755]: I1006 09:11:33.833402 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="ceilometer-notification-agent" containerID="cri-o://548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192" gracePeriod=30 Oct 06 09:11:33 crc kubenswrapper[4755]: I1006 09:11:33.833392 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="sg-core" containerID="cri-o://d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987" gracePeriod=30 Oct 06 09:11:33 crc kubenswrapper[4755]: I1006 09:11:33.863661 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.300796492 podStartE2EDuration="6.863643404s" podCreationTimestamp="2025-10-06 09:11:27 +0000 UTC" firstStartedPulling="2025-10-06 09:11:28.051950167 +0000 UTC m=+2944.881265381" lastFinishedPulling="2025-10-06 09:11:32.614797079 +0000 UTC m=+2949.444112293" observedRunningTime="2025-10-06 09:11:33.859330758 +0000 UTC m=+2950.688645972" watchObservedRunningTime="2025-10-06 09:11:33.863643404 +0000 UTC m=+2950.692958618" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.471747 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.550964 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-sg-core-conf-yaml\") pod \"7d3a88e1-4008-4374-bc89-f94cf5507b29\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.551111 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-ceilometer-tls-certs\") pod \"7d3a88e1-4008-4374-bc89-f94cf5507b29\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.551136 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpfzg\" (UniqueName: \"kubernetes.io/projected/7d3a88e1-4008-4374-bc89-f94cf5507b29-kube-api-access-dpfzg\") pod \"7d3a88e1-4008-4374-bc89-f94cf5507b29\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.551186 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-combined-ca-bundle\") pod \"7d3a88e1-4008-4374-bc89-f94cf5507b29\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.551278 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d3a88e1-4008-4374-bc89-f94cf5507b29-log-httpd\") pod \"7d3a88e1-4008-4374-bc89-f94cf5507b29\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.551294 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-config-data\") pod \"7d3a88e1-4008-4374-bc89-f94cf5507b29\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.551349 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-scripts\") pod \"7d3a88e1-4008-4374-bc89-f94cf5507b29\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.551383 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d3a88e1-4008-4374-bc89-f94cf5507b29-run-httpd\") pod \"7d3a88e1-4008-4374-bc89-f94cf5507b29\" (UID: \"7d3a88e1-4008-4374-bc89-f94cf5507b29\") " Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.552082 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d3a88e1-4008-4374-bc89-f94cf5507b29-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7d3a88e1-4008-4374-bc89-f94cf5507b29" (UID: "7d3a88e1-4008-4374-bc89-f94cf5507b29"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.552559 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d3a88e1-4008-4374-bc89-f94cf5507b29-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7d3a88e1-4008-4374-bc89-f94cf5507b29" (UID: "7d3a88e1-4008-4374-bc89-f94cf5507b29"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.557755 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-scripts" (OuterVolumeSpecName: "scripts") pod "7d3a88e1-4008-4374-bc89-f94cf5507b29" (UID: "7d3a88e1-4008-4374-bc89-f94cf5507b29"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.557795 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3a88e1-4008-4374-bc89-f94cf5507b29-kube-api-access-dpfzg" (OuterVolumeSpecName: "kube-api-access-dpfzg") pod "7d3a88e1-4008-4374-bc89-f94cf5507b29" (UID: "7d3a88e1-4008-4374-bc89-f94cf5507b29"). InnerVolumeSpecName "kube-api-access-dpfzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.579744 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7d3a88e1-4008-4374-bc89-f94cf5507b29" (UID: "7d3a88e1-4008-4374-bc89-f94cf5507b29"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.615337 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7d3a88e1-4008-4374-bc89-f94cf5507b29" (UID: "7d3a88e1-4008-4374-bc89-f94cf5507b29"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.646263 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d3a88e1-4008-4374-bc89-f94cf5507b29" (UID: "7d3a88e1-4008-4374-bc89-f94cf5507b29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.654635 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d3a88e1-4008-4374-bc89-f94cf5507b29-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.654709 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.654724 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d3a88e1-4008-4374-bc89-f94cf5507b29-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.654737 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.654746 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.654760 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpfzg\" (UniqueName: 
\"kubernetes.io/projected/7d3a88e1-4008-4374-bc89-f94cf5507b29-kube-api-access-dpfzg\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.654770 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.666862 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-config-data" (OuterVolumeSpecName: "config-data") pod "7d3a88e1-4008-4374-bc89-f94cf5507b29" (UID: "7d3a88e1-4008-4374-bc89-f94cf5507b29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.757913 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3a88e1-4008-4374-bc89-f94cf5507b29-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.846443 4755 generic.go:334] "Generic (PLEG): container finished" podID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerID="86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca" exitCode=0 Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.846522 4755 generic.go:334] "Generic (PLEG): container finished" podID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerID="d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987" exitCode=2 Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.846541 4755 generic.go:334] "Generic (PLEG): container finished" podID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerID="548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192" exitCode=0 Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.846556 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerID="fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e" exitCode=0 Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.846526 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.846611 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d3a88e1-4008-4374-bc89-f94cf5507b29","Type":"ContainerDied","Data":"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca"} Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.846743 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d3a88e1-4008-4374-bc89-f94cf5507b29","Type":"ContainerDied","Data":"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987"} Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.846768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d3a88e1-4008-4374-bc89-f94cf5507b29","Type":"ContainerDied","Data":"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192"} Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.846782 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d3a88e1-4008-4374-bc89-f94cf5507b29","Type":"ContainerDied","Data":"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e"} Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.846798 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d3a88e1-4008-4374-bc89-f94cf5507b29","Type":"ContainerDied","Data":"f96e4062e1d4918c03204703f0f6458d834c13afab41b12b78bbd9c0d73db44d"} Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.846818 4755 scope.go:117] "RemoveContainer" containerID="86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca" Oct 06 09:11:34 crc 
kubenswrapper[4755]: I1006 09:11:34.870005 4755 scope.go:117] "RemoveContainer" containerID="d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.893365 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.897647 4755 scope.go:117] "RemoveContainer" containerID="548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.903541 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.913059 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:11:34 crc kubenswrapper[4755]: E1006 09:11:34.913451 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="proxy-httpd" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.913469 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="proxy-httpd" Oct 06 09:11:34 crc kubenswrapper[4755]: E1006 09:11:34.913486 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="ceilometer-notification-agent" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.913493 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="ceilometer-notification-agent" Oct 06 09:11:34 crc kubenswrapper[4755]: E1006 09:11:34.913504 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="sg-core" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.913510 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="sg-core" Oct 06 09:11:34 crc 
kubenswrapper[4755]: E1006 09:11:34.913536 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="ceilometer-central-agent" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.913542 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="ceilometer-central-agent" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.913720 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="sg-core" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.913739 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="proxy-httpd" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.913749 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="ceilometer-notification-agent" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.913772 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" containerName="ceilometer-central-agent" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.915288 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.925134 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.925294 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.925885 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.926114 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 09:11:34 crc kubenswrapper[4755]: I1006 09:11:34.948952 4755 scope.go:117] "RemoveContainer" containerID="fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.016188 4755 scope.go:117] "RemoveContainer" containerID="86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca" Oct 06 09:11:35 crc kubenswrapper[4755]: E1006 09:11:35.016866 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca\": container with ID starting with 86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca not found: ID does not exist" containerID="86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.016914 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca"} err="failed to get container status \"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca\": rpc error: code = NotFound desc = could not find container \"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca\": 
container with ID starting with 86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.016953 4755 scope.go:117] "RemoveContainer" containerID="d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987" Oct 06 09:11:35 crc kubenswrapper[4755]: E1006 09:11:35.017241 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987\": container with ID starting with d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987 not found: ID does not exist" containerID="d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.017264 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987"} err="failed to get container status \"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987\": rpc error: code = NotFound desc = could not find container \"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987\": container with ID starting with d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987 not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.017277 4755 scope.go:117] "RemoveContainer" containerID="548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192" Oct 06 09:11:35 crc kubenswrapper[4755]: E1006 09:11:35.017665 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192\": container with ID starting with 548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192 not found: ID does not exist" 
containerID="548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.017712 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192"} err="failed to get container status \"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192\": rpc error: code = NotFound desc = could not find container \"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192\": container with ID starting with 548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192 not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.017732 4755 scope.go:117] "RemoveContainer" containerID="fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e" Oct 06 09:11:35 crc kubenswrapper[4755]: E1006 09:11:35.017977 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e\": container with ID starting with fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e not found: ID does not exist" containerID="fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.017997 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e"} err="failed to get container status \"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e\": rpc error: code = NotFound desc = could not find container \"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e\": container with ID starting with fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.018010 4755 scope.go:117] 
"RemoveContainer" containerID="86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.018237 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca"} err="failed to get container status \"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca\": rpc error: code = NotFound desc = could not find container \"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca\": container with ID starting with 86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.018255 4755 scope.go:117] "RemoveContainer" containerID="d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.018637 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987"} err="failed to get container status \"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987\": rpc error: code = NotFound desc = could not find container \"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987\": container with ID starting with d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987 not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.018658 4755 scope.go:117] "RemoveContainer" containerID="548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.018885 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192"} err="failed to get container status \"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192\": rpc error: code = 
NotFound desc = could not find container \"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192\": container with ID starting with 548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192 not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.018903 4755 scope.go:117] "RemoveContainer" containerID="fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.019286 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e"} err="failed to get container status \"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e\": rpc error: code = NotFound desc = could not find container \"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e\": container with ID starting with fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.019305 4755 scope.go:117] "RemoveContainer" containerID="86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.019474 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca"} err="failed to get container status \"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca\": rpc error: code = NotFound desc = could not find container \"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca\": container with ID starting with 86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.019493 4755 scope.go:117] "RemoveContainer" containerID="d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987" Oct 06 09:11:35 crc 
kubenswrapper[4755]: I1006 09:11:35.019662 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987"} err="failed to get container status \"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987\": rpc error: code = NotFound desc = could not find container \"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987\": container with ID starting with d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987 not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.019682 4755 scope.go:117] "RemoveContainer" containerID="548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.019888 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192"} err="failed to get container status \"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192\": rpc error: code = NotFound desc = could not find container \"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192\": container with ID starting with 548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192 not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.019908 4755 scope.go:117] "RemoveContainer" containerID="fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.020128 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e"} err="failed to get container status \"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e\": rpc error: code = NotFound desc = could not find container \"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e\": container 
with ID starting with fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.020145 4755 scope.go:117] "RemoveContainer" containerID="86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.020310 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca"} err="failed to get container status \"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca\": rpc error: code = NotFound desc = could not find container \"86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca\": container with ID starting with 86af0ab404406ec6af531c1f44da7a40bc726cc1775c7aadc77e33432723abca not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.020325 4755 scope.go:117] "RemoveContainer" containerID="d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.020576 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987"} err="failed to get container status \"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987\": rpc error: code = NotFound desc = could not find container \"d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987\": container with ID starting with d8762f89e6b742babf6a3999c16720cbe737f84b4e396b38cb8b8f834276a987 not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.020595 4755 scope.go:117] "RemoveContainer" containerID="548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.020789 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192"} err="failed to get container status \"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192\": rpc error: code = NotFound desc = could not find container \"548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192\": container with ID starting with 548b33285bc4635776394c709cad62ddc803ab8a371dbdad8abe77040ea94192 not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.020807 4755 scope.go:117] "RemoveContainer" containerID="fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.020960 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e"} err="failed to get container status \"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e\": rpc error: code = NotFound desc = could not find container \"fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e\": container with ID starting with fc1248a871d8b542869bf6a2345966c110ab38aa8bdc63d5ba20cf0b0392758e not found: ID does not exist" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.063149 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a8fee4-7b04-4487-b9d4-718640b217e4-run-httpd\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.063232 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc 
kubenswrapper[4755]: I1006 09:11:35.063253 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.063300 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.063323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-config-data\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.063341 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-scripts\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.063395 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a8fee4-7b04-4487-b9d4-718640b217e4-log-httpd\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.063411 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-tplkf\" (UniqueName: \"kubernetes.io/projected/e5a8fee4-7b04-4487-b9d4-718640b217e4-kube-api-access-tplkf\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.165455 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tplkf\" (UniqueName: \"kubernetes.io/projected/e5a8fee4-7b04-4487-b9d4-718640b217e4-kube-api-access-tplkf\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.165507 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a8fee4-7b04-4487-b9d4-718640b217e4-log-httpd\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.165616 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a8fee4-7b04-4487-b9d4-718640b217e4-run-httpd\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.165671 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.165700 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " 
pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.165757 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.165793 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-config-data\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.165817 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-scripts\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.167405 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a8fee4-7b04-4487-b9d4-718640b217e4-log-httpd\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.167890 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a8fee4-7b04-4487-b9d4-718640b217e4-run-httpd\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.169600 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-scripts\") pod 
\"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.172929 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-config-data\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.173812 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.177647 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.179298 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a8fee4-7b04-4487-b9d4-718640b217e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.183161 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tplkf\" (UniqueName: \"kubernetes.io/projected/e5a8fee4-7b04-4487-b9d4-718640b217e4-kube-api-access-tplkf\") pod \"ceilometer-0\" (UID: \"e5a8fee4-7b04-4487-b9d4-718640b217e4\") " pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.238790 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.694376 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.859438 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a8fee4-7b04-4487-b9d4-718640b217e4","Type":"ContainerStarted","Data":"786e0fe9db9da9da8706447bcbafaace7f4f4693c43fbb7a340b45e01dde2d2c"} Oct 06 09:11:35 crc kubenswrapper[4755]: I1006 09:11:35.891868 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d3a88e1-4008-4374-bc89-f94cf5507b29" path="/var/lib/kubelet/pods/7d3a88e1-4008-4374-bc89-f94cf5507b29/volumes" Oct 06 09:11:36 crc kubenswrapper[4755]: I1006 09:11:36.884545 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a8fee4-7b04-4487-b9d4-718640b217e4","Type":"ContainerStarted","Data":"eac547499009ac5872e110c849c63b6ee88153a4cac7709cb3422867d088960f"} Oct 06 09:11:37 crc kubenswrapper[4755]: I1006 09:11:37.566904 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 06 09:11:37 crc kubenswrapper[4755]: I1006 09:11:37.612417 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 06 09:11:37 crc kubenswrapper[4755]: I1006 09:11:37.900867 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 06 09:11:37 crc kubenswrapper[4755]: I1006 09:11:37.904367 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="004e482c-7cdb-4416-91d9-d0c43641625d" containerName="manila-scheduler" containerID="cri-o://fd8a6fc4c174ff4e8bab88005a022489b1c6421dde4c5ee907aa47883c67f4ef" gracePeriod=30 Oct 06 09:11:37 crc kubenswrapper[4755]: I1006 09:11:37.904653 4755 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a8fee4-7b04-4487-b9d4-718640b217e4","Type":"ContainerStarted","Data":"6ce59a3b60dbc056600fe40015476a6cca977d189be925506a3140627e0056d1"} Oct 06 09:11:37 crc kubenswrapper[4755]: I1006 09:11:37.904684 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a8fee4-7b04-4487-b9d4-718640b217e4","Type":"ContainerStarted","Data":"b4e60f573a98d03f6a1f7de182d38e093141ad04a704966966b340012615e501"} Oct 06 09:11:37 crc kubenswrapper[4755]: I1006 09:11:37.904719 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="004e482c-7cdb-4416-91d9-d0c43641625d" containerName="probe" containerID="cri-o://ab12b21a86beebceabbed399d042518eb1587cac5de069a48abca79b2a75b5e1" gracePeriod=30 Oct 06 09:11:37 crc kubenswrapper[4755]: I1006 09:11:37.976010 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 06 09:11:38 crc kubenswrapper[4755]: I1006 09:11:38.918859 4755 generic.go:334] "Generic (PLEG): container finished" podID="004e482c-7cdb-4416-91d9-d0c43641625d" containerID="ab12b21a86beebceabbed399d042518eb1587cac5de069a48abca79b2a75b5e1" exitCode=0 Oct 06 09:11:38 crc kubenswrapper[4755]: I1006 09:11:38.919105 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="21a73e92-20ea-4a05-8c43-e84b9e0bb15d" containerName="manila-share" containerID="cri-o://e1bc7238c6c6c5d875957d93de50e1f0ab2679a64041939c13dd4208649984ba" gracePeriod=30 Oct 06 09:11:38 crc kubenswrapper[4755]: I1006 09:11:38.919352 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"004e482c-7cdb-4416-91d9-d0c43641625d","Type":"ContainerDied","Data":"ab12b21a86beebceabbed399d042518eb1587cac5de069a48abca79b2a75b5e1"} Oct 06 09:11:38 crc kubenswrapper[4755]: I1006 09:11:38.919706 4755 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="21a73e92-20ea-4a05-8c43-e84b9e0bb15d" containerName="probe" containerID="cri-o://8e4e83da443ff2c36ddd03f7f31188971758ad9c2c0f4c3baeac5da72e2d6852" gracePeriod=30 Oct 06 09:11:38 crc kubenswrapper[4755]: I1006 09:11:38.931237 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-77dcf6c7d-x7gnh" podUID="44e61052-105b-4bd0-8056-8a29dec9fcfe" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.245:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.245:8443: connect: connection refused" Oct 06 09:11:39 crc kubenswrapper[4755]: E1006 09:11:39.457093 4755 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/41971c62fb509baa4efdf35ba4e5c81b93d44f7c86f97ae02870a02b778e8c59/diff" to get inode usage: stat /var/lib/containers/storage/overlay/41971c62fb509baa4efdf35ba4e5c81b93d44f7c86f97ae02870a02b778e8c59/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_dnsmasq-dns-864d5fc68c-tc5cr_5e4b100e-d1f9-4bed-a11a-a6d3d593cc24/dnsmasq-dns/0.log" to get inode usage: stat /var/log/pods/openstack_dnsmasq-dns-864d5fc68c-tc5cr_5e4b100e-d1f9-4bed-a11a-a6d3d593cc24/dnsmasq-dns/0.log: no such file or directory Oct 06 09:11:39 crc kubenswrapper[4755]: I1006 09:11:39.933426 4755 generic.go:334] "Generic (PLEG): container finished" podID="21a73e92-20ea-4a05-8c43-e84b9e0bb15d" containerID="8e4e83da443ff2c36ddd03f7f31188971758ad9c2c0f4c3baeac5da72e2d6852" exitCode=0 Oct 06 09:11:39 crc kubenswrapper[4755]: I1006 09:11:39.933663 4755 generic.go:334] "Generic (PLEG): container finished" podID="21a73e92-20ea-4a05-8c43-e84b9e0bb15d" containerID="e1bc7238c6c6c5d875957d93de50e1f0ab2679a64041939c13dd4208649984ba" exitCode=1 Oct 06 09:11:39 crc kubenswrapper[4755]: I1006 09:11:39.933706 4755 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"21a73e92-20ea-4a05-8c43-e84b9e0bb15d","Type":"ContainerDied","Data":"8e4e83da443ff2c36ddd03f7f31188971758ad9c2c0f4c3baeac5da72e2d6852"} Oct 06 09:11:39 crc kubenswrapper[4755]: I1006 09:11:39.933732 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"21a73e92-20ea-4a05-8c43-e84b9e0bb15d","Type":"ContainerDied","Data":"e1bc7238c6c6c5d875957d93de50e1f0ab2679a64041939c13dd4208649984ba"} Oct 06 09:11:39 crc kubenswrapper[4755]: I1006 09:11:39.936684 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a8fee4-7b04-4487-b9d4-718640b217e4","Type":"ContainerStarted","Data":"c691a0ebe0dfb82e9112558e36d0045ad914700282403a5469e03cc3b99cd8e1"} Oct 06 09:11:39 crc kubenswrapper[4755]: I1006 09:11:39.936832 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 09:11:39 crc kubenswrapper[4755]: I1006 09:11:39.962882 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.8362090540000002 podStartE2EDuration="5.962860424s" podCreationTimestamp="2025-10-06 09:11:34 +0000 UTC" firstStartedPulling="2025-10-06 09:11:35.70111083 +0000 UTC m=+2952.530426044" lastFinishedPulling="2025-10-06 09:11:38.82776219 +0000 UTC m=+2955.657077414" observedRunningTime="2025-10-06 09:11:39.955633506 +0000 UTC m=+2956.784948720" watchObservedRunningTime="2025-10-06 09:11:39.962860424 +0000 UTC m=+2956.792175638" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.046119 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.174525 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-var-lib-manila\") pod \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.174650 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-config-data-custom\") pod \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.174676 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "21a73e92-20ea-4a05-8c43-e84b9e0bb15d" (UID: "21a73e92-20ea-4a05-8c43-e84b9e0bb15d"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.174698 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-ceph\") pod \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.174781 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-config-data\") pod \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.174828 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-scripts\") pod \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.174956 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-etc-machine-id\") pod \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.175061 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69r7l\" (UniqueName: \"kubernetes.io/projected/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-kube-api-access-69r7l\") pod \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.175168 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-combined-ca-bundle\") pod \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\" (UID: \"21a73e92-20ea-4a05-8c43-e84b9e0bb15d\") " Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.175785 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "21a73e92-20ea-4a05-8c43-e84b9e0bb15d" (UID: "21a73e92-20ea-4a05-8c43-e84b9e0bb15d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.176093 4755 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-var-lib-manila\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.176121 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.180853 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-scripts" (OuterVolumeSpecName: "scripts") pod "21a73e92-20ea-4a05-8c43-e84b9e0bb15d" (UID: "21a73e92-20ea-4a05-8c43-e84b9e0bb15d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.181544 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "21a73e92-20ea-4a05-8c43-e84b9e0bb15d" (UID: "21a73e92-20ea-4a05-8c43-e84b9e0bb15d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.181835 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-kube-api-access-69r7l" (OuterVolumeSpecName: "kube-api-access-69r7l") pod "21a73e92-20ea-4a05-8c43-e84b9e0bb15d" (UID: "21a73e92-20ea-4a05-8c43-e84b9e0bb15d"). InnerVolumeSpecName "kube-api-access-69r7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.182114 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-ceph" (OuterVolumeSpecName: "ceph") pod "21a73e92-20ea-4a05-8c43-e84b9e0bb15d" (UID: "21a73e92-20ea-4a05-8c43-e84b9e0bb15d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.238083 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21a73e92-20ea-4a05-8c43-e84b9e0bb15d" (UID: "21a73e92-20ea-4a05-8c43-e84b9e0bb15d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.277715 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.277751 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.277764 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.277777 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.277791 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69r7l\" (UniqueName: \"kubernetes.io/projected/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-kube-api-access-69r7l\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.278371 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-config-data" (OuterVolumeSpecName: "config-data") pod "21a73e92-20ea-4a05-8c43-e84b9e0bb15d" (UID: "21a73e92-20ea-4a05-8c43-e84b9e0bb15d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.379631 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a73e92-20ea-4a05-8c43-e84b9e0bb15d-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.948380 4755 generic.go:334] "Generic (PLEG): container finished" podID="004e482c-7cdb-4416-91d9-d0c43641625d" containerID="fd8a6fc4c174ff4e8bab88005a022489b1c6421dde4c5ee907aa47883c67f4ef" exitCode=0 Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.948446 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"004e482c-7cdb-4416-91d9-d0c43641625d","Type":"ContainerDied","Data":"fd8a6fc4c174ff4e8bab88005a022489b1c6421dde4c5ee907aa47883c67f4ef"} Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.951322 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.951671 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"21a73e92-20ea-4a05-8c43-e84b9e0bb15d","Type":"ContainerDied","Data":"e19f0958cc7d469854afa55ccee533eef9d12b4cacd44026e0aa9b98206c32c2"} Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.951719 4755 scope.go:117] "RemoveContainer" containerID="8e4e83da443ff2c36ddd03f7f31188971758ad9c2c0f4c3baeac5da72e2d6852" Oct 06 09:11:40 crc kubenswrapper[4755]: I1006 09:11:40.981407 4755 scope.go:117] "RemoveContainer" containerID="e1bc7238c6c6c5d875957d93de50e1f0ab2679a64041939c13dd4208649984ba" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.001915 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.023762 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/manila-share-share1-0"] Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.046925 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 06 09:11:41 crc kubenswrapper[4755]: E1006 09:11:41.047600 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21a73e92-20ea-4a05-8c43-e84b9e0bb15d" containerName="probe" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.047619 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="21a73e92-20ea-4a05-8c43-e84b9e0bb15d" containerName="probe" Oct 06 09:11:41 crc kubenswrapper[4755]: E1006 09:11:41.047672 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21a73e92-20ea-4a05-8c43-e84b9e0bb15d" containerName="manila-share" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.047682 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="21a73e92-20ea-4a05-8c43-e84b9e0bb15d" containerName="manila-share" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.047865 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="21a73e92-20ea-4a05-8c43-e84b9e0bb15d" containerName="probe" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.047883 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="21a73e92-20ea-4a05-8c43-e84b9e0bb15d" containerName="manila-share" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.049398 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.061135 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.076513 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.201413 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.206245 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/400eacf6-c350-4499-b95f-1e8ad5b09dab-scripts\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.206315 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/400eacf6-c350-4499-b95f-1e8ad5b09dab-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.206399 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/400eacf6-c350-4499-b95f-1e8ad5b09dab-config-data\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.206525 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/400eacf6-c350-4499-b95f-1e8ad5b09dab-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.206653 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/400eacf6-c350-4499-b95f-1e8ad5b09dab-ceph\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 
09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.206780 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/400eacf6-c350-4499-b95f-1e8ad5b09dab-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.206877 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s58p9\" (UniqueName: \"kubernetes.io/projected/400eacf6-c350-4499-b95f-1e8ad5b09dab-kube-api-access-s58p9\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.207183 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/400eacf6-c350-4499-b95f-1e8ad5b09dab-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308146 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-combined-ca-bundle\") pod \"004e482c-7cdb-4416-91d9-d0c43641625d\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308190 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/004e482c-7cdb-4416-91d9-d0c43641625d-etc-machine-id\") pod \"004e482c-7cdb-4416-91d9-d0c43641625d\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308258 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-config-data-custom\") pod \"004e482c-7cdb-4416-91d9-d0c43641625d\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308355 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/004e482c-7cdb-4416-91d9-d0c43641625d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "004e482c-7cdb-4416-91d9-d0c43641625d" (UID: "004e482c-7cdb-4416-91d9-d0c43641625d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308382 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5nks\" (UniqueName: \"kubernetes.io/projected/004e482c-7cdb-4416-91d9-d0c43641625d-kube-api-access-p5nks\") pod \"004e482c-7cdb-4416-91d9-d0c43641625d\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308413 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-config-data\") pod \"004e482c-7cdb-4416-91d9-d0c43641625d\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308483 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-scripts\") pod \"004e482c-7cdb-4416-91d9-d0c43641625d\" (UID: \"004e482c-7cdb-4416-91d9-d0c43641625d\") " Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308750 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/400eacf6-c350-4499-b95f-1e8ad5b09dab-config-data\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308774 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/400eacf6-c350-4499-b95f-1e8ad5b09dab-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308798 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/400eacf6-c350-4499-b95f-1e8ad5b09dab-ceph\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308830 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/400eacf6-c350-4499-b95f-1e8ad5b09dab-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308851 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s58p9\" (UniqueName: \"kubernetes.io/projected/400eacf6-c350-4499-b95f-1e8ad5b09dab-kube-api-access-s58p9\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308948 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/400eacf6-c350-4499-b95f-1e8ad5b09dab-var-lib-manila\") pod \"manila-share-share1-0\" (UID: 
\"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.308972 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/400eacf6-c350-4499-b95f-1e8ad5b09dab-scripts\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.309003 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/400eacf6-c350-4499-b95f-1e8ad5b09dab-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.309016 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/400eacf6-c350-4499-b95f-1e8ad5b09dab-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.309335 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/004e482c-7cdb-4416-91d9-d0c43641625d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.310799 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/400eacf6-c350-4499-b95f-1e8ad5b09dab-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.314159 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/400eacf6-c350-4499-b95f-1e8ad5b09dab-ceph\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.314367 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-scripts" (OuterVolumeSpecName: "scripts") pod "004e482c-7cdb-4416-91d9-d0c43641625d" (UID: "004e482c-7cdb-4416-91d9-d0c43641625d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.315811 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/400eacf6-c350-4499-b95f-1e8ad5b09dab-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.316032 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/400eacf6-c350-4499-b95f-1e8ad5b09dab-scripts\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.316395 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "004e482c-7cdb-4416-91d9-d0c43641625d" (UID: "004e482c-7cdb-4416-91d9-d0c43641625d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.317829 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/004e482c-7cdb-4416-91d9-d0c43641625d-kube-api-access-p5nks" (OuterVolumeSpecName: "kube-api-access-p5nks") pod "004e482c-7cdb-4416-91d9-d0c43641625d" (UID: "004e482c-7cdb-4416-91d9-d0c43641625d"). InnerVolumeSpecName "kube-api-access-p5nks". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.323555 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/400eacf6-c350-4499-b95f-1e8ad5b09dab-config-data\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.325315 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/400eacf6-c350-4499-b95f-1e8ad5b09dab-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.329230 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s58p9\" (UniqueName: \"kubernetes.io/projected/400eacf6-c350-4499-b95f-1e8ad5b09dab-kube-api-access-s58p9\") pod \"manila-share-share1-0\" (UID: \"400eacf6-c350-4499-b95f-1e8ad5b09dab\") " pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.383457 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "004e482c-7cdb-4416-91d9-d0c43641625d" (UID: "004e482c-7cdb-4416-91d9-d0c43641625d"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.387323 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.412153 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.412651 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.412670 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5nks\" (UniqueName: \"kubernetes.io/projected/004e482c-7cdb-4416-91d9-d0c43641625d-kube-api-access-p5nks\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.412683 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.435625 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-config-data" (OuterVolumeSpecName: "config-data") pod "004e482c-7cdb-4416-91d9-d0c43641625d" (UID: "004e482c-7cdb-4416-91d9-d0c43641625d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.515851 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004e482c-7cdb-4416-91d9-d0c43641625d-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.891409 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21a73e92-20ea-4a05-8c43-e84b9e0bb15d" path="/var/lib/kubelet/pods/21a73e92-20ea-4a05-8c43-e84b9e0bb15d/volumes" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.900255 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.972039 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"004e482c-7cdb-4416-91d9-d0c43641625d","Type":"ContainerDied","Data":"6f01f52603cfb7a35a78dabd7b855324cb840f88e0b4c153e53590bfb0fa3080"} Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.972504 4755 scope.go:117] "RemoveContainer" containerID="ab12b21a86beebceabbed399d042518eb1587cac5de069a48abca79b2a75b5e1" Oct 06 09:11:41 crc kubenswrapper[4755]: I1006 09:11:41.972684 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.062679 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"400eacf6-c350-4499-b95f-1e8ad5b09dab","Type":"ContainerStarted","Data":"f3e733f3e2f07945db6021d1c0cd3e5132f469c9d62a15e2e9c72253ed6b9f52"} Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.107469 4755 scope.go:117] "RemoveContainer" containerID="fd8a6fc4c174ff4e8bab88005a022489b1c6421dde4c5ee907aa47883c67f4ef" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.113324 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.125726 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.182646 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 06 09:11:42 crc kubenswrapper[4755]: E1006 09:11:42.183390 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="004e482c-7cdb-4416-91d9-d0c43641625d" containerName="manila-scheduler" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.183414 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="004e482c-7cdb-4416-91d9-d0c43641625d" containerName="manila-scheduler" Oct 06 09:11:42 crc kubenswrapper[4755]: E1006 09:11:42.183445 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="004e482c-7cdb-4416-91d9-d0c43641625d" containerName="probe" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.183454 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="004e482c-7cdb-4416-91d9-d0c43641625d" containerName="probe" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.183875 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="004e482c-7cdb-4416-91d9-d0c43641625d" containerName="probe" Oct 06 09:11:42 crc 
kubenswrapper[4755]: I1006 09:11:42.183906 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="004e482c-7cdb-4416-91d9-d0c43641625d" containerName="manila-scheduler" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.185955 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.190169 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.207420 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.364638 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9874a772-0d86-4316-9512-139b7b140518-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.365223 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9874a772-0d86-4316-9512-139b7b140518-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.365286 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9874a772-0d86-4316-9512-139b7b140518-scripts\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.365362 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9874a772-0d86-4316-9512-139b7b140518-config-data\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.365432 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9874a772-0d86-4316-9512-139b7b140518-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.365500 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n28p\" (UniqueName: \"kubernetes.io/projected/9874a772-0d86-4316-9512-139b7b140518-kube-api-access-5n28p\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.466701 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9874a772-0d86-4316-9512-139b7b140518-config-data\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.466788 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9874a772-0d86-4316-9512-139b7b140518-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.466844 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n28p\" (UniqueName: 
\"kubernetes.io/projected/9874a772-0d86-4316-9512-139b7b140518-kube-api-access-5n28p\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.466885 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9874a772-0d86-4316-9512-139b7b140518-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.466982 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9874a772-0d86-4316-9512-139b7b140518-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.467043 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9874a772-0d86-4316-9512-139b7b140518-scripts\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.467087 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9874a772-0d86-4316-9512-139b7b140518-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.470105 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9874a772-0d86-4316-9512-139b7b140518-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " 
pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.470362 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9874a772-0d86-4316-9512-139b7b140518-config-data\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.470589 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9874a772-0d86-4316-9512-139b7b140518-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.470732 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9874a772-0d86-4316-9512-139b7b140518-scripts\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.493186 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n28p\" (UniqueName: \"kubernetes.io/projected/9874a772-0d86-4316-9512-139b7b140518-kube-api-access-5n28p\") pod \"manila-scheduler-0\" (UID: \"9874a772-0d86-4316-9512-139b7b140518\") " pod="openstack/manila-scheduler-0" Oct 06 09:11:42 crc kubenswrapper[4755]: I1006 09:11:42.520315 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 06 09:11:43 crc kubenswrapper[4755]: I1006 09:11:43.075715 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"400eacf6-c350-4499-b95f-1e8ad5b09dab","Type":"ContainerStarted","Data":"f300d99f1140b5379eb5fc0b90a929c256d9994360fd5d5a8417fc4fc1b5f535"} Oct 06 09:11:43 crc kubenswrapper[4755]: I1006 09:11:43.222772 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 06 09:11:43 crc kubenswrapper[4755]: I1006 09:11:43.895797 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="004e482c-7cdb-4416-91d9-d0c43641625d" path="/var/lib/kubelet/pods/004e482c-7cdb-4416-91d9-d0c43641625d/volumes" Oct 06 09:11:44 crc kubenswrapper[4755]: I1006 09:11:44.092291 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"400eacf6-c350-4499-b95f-1e8ad5b09dab","Type":"ContainerStarted","Data":"2dd5148fbfb5211a1905771fafb3bcd923e55296911de16d326e0ae68e7cb5bf"} Oct 06 09:11:44 crc kubenswrapper[4755]: I1006 09:11:44.095312 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"9874a772-0d86-4316-9512-139b7b140518","Type":"ContainerStarted","Data":"cb363b2b38206e90e6cc54987b22d17abcb3adaa990bbd63be29f3cb7c1eee85"} Oct 06 09:11:44 crc kubenswrapper[4755]: I1006 09:11:44.095367 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"9874a772-0d86-4316-9512-139b7b140518","Type":"ContainerStarted","Data":"e0df49ffba2d37f6339097a7bdfd4cd09ae84a208cc0b8d2e4f014ef89030865"} Oct 06 09:11:44 crc kubenswrapper[4755]: I1006 09:11:44.119851 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.119814369 podStartE2EDuration="4.119814369s" podCreationTimestamp="2025-10-06 09:11:40 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:11:44.114891478 +0000 UTC m=+2960.944206702" watchObservedRunningTime="2025-10-06 09:11:44.119814369 +0000 UTC m=+2960.949129573" Oct 06 09:11:45 crc kubenswrapper[4755]: I1006 09:11:45.109521 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"9874a772-0d86-4316-9512-139b7b140518","Type":"ContainerStarted","Data":"51d3bc86ede9aaed95a7b4791f22611e077296dc9e5445e88c349e7b824c3fed"} Oct 06 09:11:45 crc kubenswrapper[4755]: I1006 09:11:45.137451 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.137411652 podStartE2EDuration="3.137411652s" podCreationTimestamp="2025-10-06 09:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:11:45.133751022 +0000 UTC m=+2961.963066256" watchObservedRunningTime="2025-10-06 09:11:45.137411652 +0000 UTC m=+2961.966726866" Oct 06 09:11:46 crc kubenswrapper[4755]: I1006 09:11:46.904049 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 06 09:11:48 crc kubenswrapper[4755]: I1006 09:11:48.929756 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-77dcf6c7d-x7gnh" podUID="44e61052-105b-4bd0-8056-8a29dec9fcfe" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.245:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.245:8443: connect: connection refused" Oct 06 09:11:48 crc kubenswrapper[4755]: I1006 09:11:48.930423 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:11:51 crc kubenswrapper[4755]: I1006 09:11:51.389067 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/manila-share-share1-0" Oct 06 09:11:52 crc kubenswrapper[4755]: I1006 09:11:52.521734 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 06 09:11:53 crc kubenswrapper[4755]: W1006 09:11:53.662069 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21a73e92_20ea_4a05_8c43_e84b9e0bb15d.slice/crio-e1bc7238c6c6c5d875957d93de50e1f0ab2679a64041939c13dd4208649984ba.scope WatchSource:0}: Error finding container e1bc7238c6c6c5d875957d93de50e1f0ab2679a64041939c13dd4208649984ba: Status 404 returned error can't find the container with id e1bc7238c6c6c5d875957d93de50e1f0ab2679a64041939c13dd4208649984ba Oct 06 09:11:53 crc kubenswrapper[4755]: E1006 09:11:53.665739 4755 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21a73e92_20ea_4a05_8c43_e84b9e0bb15d.slice/crio-e19f0958cc7d469854afa55ccee533eef9d12b4cacd44026e0aa9b98206c32c2: Error finding container e19f0958cc7d469854afa55ccee533eef9d12b4cacd44026e0aa9b98206c32c2: Status 404 returned error can't find the container with id e19f0958cc7d469854afa55ccee533eef9d12b4cacd44026e0aa9b98206c32c2 Oct 06 09:11:53 crc kubenswrapper[4755]: W1006 09:11:53.665858 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d3a88e1_4008_4374_bc89_f94cf5507b29.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d3a88e1_4008_4374_bc89_f94cf5507b29.slice: no such file or directory Oct 06 09:11:53 crc kubenswrapper[4755]: W1006 09:11:53.666052 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21a73e92_20ea_4a05_8c43_e84b9e0bb15d.slice/crio-8e4e83da443ff2c36ddd03f7f31188971758ad9c2c0f4c3baeac5da72e2d6852.scope WatchSource:0}: Error finding container 8e4e83da443ff2c36ddd03f7f31188971758ad9c2c0f4c3baeac5da72e2d6852: Status 404 returned error can't find the container with id 8e4e83da443ff2c36ddd03f7f31188971758ad9c2c0f4c3baeac5da72e2d6852 Oct 06 09:11:53 crc kubenswrapper[4755]: E1006 09:11:53.902964 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod004e482c_7cdb_4416_91d9_d0c43641625d.slice/crio-6f01f52603cfb7a35a78dabd7b855324cb840f88e0b4c153e53590bfb0fa3080\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44e61052_105b_4bd0_8056_8a29dec9fcfe.slice/crio-conmon-75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod004e482c_7cdb_4416_91d9_d0c43641625d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44e61052_105b_4bd0_8056_8a29dec9fcfe.slice/crio-75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb.scope\": RecentStats: unable to find data in memory cache]" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.049968 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.200373 4755 generic.go:334] "Generic (PLEG): container finished" podID="44e61052-105b-4bd0-8056-8a29dec9fcfe" containerID="75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb" exitCode=137 Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.200434 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77dcf6c7d-x7gnh" event={"ID":"44e61052-105b-4bd0-8056-8a29dec9fcfe","Type":"ContainerDied","Data":"75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb"} Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.200468 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77dcf6c7d-x7gnh" event={"ID":"44e61052-105b-4bd0-8056-8a29dec9fcfe","Type":"ContainerDied","Data":"7a65f606511138dddf084201ea24a9eca35b6fc301c117541b4aefa75bb56ae0"} Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.200473 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77dcf6c7d-x7gnh" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.200492 4755 scope.go:117] "RemoveContainer" containerID="389df74a08b498ecdb20ff74a17163729f48bb649e4cea0823c4718993d15c73" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.234538 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-combined-ca-bundle\") pod \"44e61052-105b-4bd0-8056-8a29dec9fcfe\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.234812 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9bbr\" (UniqueName: \"kubernetes.io/projected/44e61052-105b-4bd0-8056-8a29dec9fcfe-kube-api-access-x9bbr\") pod \"44e61052-105b-4bd0-8056-8a29dec9fcfe\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.234924 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-horizon-tls-certs\") pod \"44e61052-105b-4bd0-8056-8a29dec9fcfe\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.234992 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44e61052-105b-4bd0-8056-8a29dec9fcfe-config-data\") pod \"44e61052-105b-4bd0-8056-8a29dec9fcfe\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.235088 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44e61052-105b-4bd0-8056-8a29dec9fcfe-logs\") pod \"44e61052-105b-4bd0-8056-8a29dec9fcfe\" (UID: 
\"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.235189 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44e61052-105b-4bd0-8056-8a29dec9fcfe-scripts\") pod \"44e61052-105b-4bd0-8056-8a29dec9fcfe\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.235398 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-horizon-secret-key\") pod \"44e61052-105b-4bd0-8056-8a29dec9fcfe\" (UID: \"44e61052-105b-4bd0-8056-8a29dec9fcfe\") " Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.238794 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44e61052-105b-4bd0-8056-8a29dec9fcfe-logs" (OuterVolumeSpecName: "logs") pod "44e61052-105b-4bd0-8056-8a29dec9fcfe" (UID: "44e61052-105b-4bd0-8056-8a29dec9fcfe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.241902 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "44e61052-105b-4bd0-8056-8a29dec9fcfe" (UID: "44e61052-105b-4bd0-8056-8a29dec9fcfe"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.244953 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e61052-105b-4bd0-8056-8a29dec9fcfe-kube-api-access-x9bbr" (OuterVolumeSpecName: "kube-api-access-x9bbr") pod "44e61052-105b-4bd0-8056-8a29dec9fcfe" (UID: "44e61052-105b-4bd0-8056-8a29dec9fcfe"). 
InnerVolumeSpecName "kube-api-access-x9bbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.267175 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44e61052-105b-4bd0-8056-8a29dec9fcfe" (UID: "44e61052-105b-4bd0-8056-8a29dec9fcfe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.276910 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44e61052-105b-4bd0-8056-8a29dec9fcfe-scripts" (OuterVolumeSpecName: "scripts") pod "44e61052-105b-4bd0-8056-8a29dec9fcfe" (UID: "44e61052-105b-4bd0-8056-8a29dec9fcfe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.289465 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44e61052-105b-4bd0-8056-8a29dec9fcfe-config-data" (OuterVolumeSpecName: "config-data") pod "44e61052-105b-4bd0-8056-8a29dec9fcfe" (UID: "44e61052-105b-4bd0-8056-8a29dec9fcfe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.292159 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "44e61052-105b-4bd0-8056-8a29dec9fcfe" (UID: "44e61052-105b-4bd0-8056-8a29dec9fcfe"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.337747 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.337784 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.337795 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9bbr\" (UniqueName: \"kubernetes.io/projected/44e61052-105b-4bd0-8056-8a29dec9fcfe-kube-api-access-x9bbr\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.337806 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/44e61052-105b-4bd0-8056-8a29dec9fcfe-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.337815 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44e61052-105b-4bd0-8056-8a29dec9fcfe-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.337824 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44e61052-105b-4bd0-8056-8a29dec9fcfe-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.337832 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44e61052-105b-4bd0-8056-8a29dec9fcfe-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.399844 4755 scope.go:117] 
"RemoveContainer" containerID="75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.422197 4755 scope.go:117] "RemoveContainer" containerID="389df74a08b498ecdb20ff74a17163729f48bb649e4cea0823c4718993d15c73" Oct 06 09:11:54 crc kubenswrapper[4755]: E1006 09:11:54.422927 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"389df74a08b498ecdb20ff74a17163729f48bb649e4cea0823c4718993d15c73\": container with ID starting with 389df74a08b498ecdb20ff74a17163729f48bb649e4cea0823c4718993d15c73 not found: ID does not exist" containerID="389df74a08b498ecdb20ff74a17163729f48bb649e4cea0823c4718993d15c73" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.423148 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"389df74a08b498ecdb20ff74a17163729f48bb649e4cea0823c4718993d15c73"} err="failed to get container status \"389df74a08b498ecdb20ff74a17163729f48bb649e4cea0823c4718993d15c73\": rpc error: code = NotFound desc = could not find container \"389df74a08b498ecdb20ff74a17163729f48bb649e4cea0823c4718993d15c73\": container with ID starting with 389df74a08b498ecdb20ff74a17163729f48bb649e4cea0823c4718993d15c73 not found: ID does not exist" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.423320 4755 scope.go:117] "RemoveContainer" containerID="75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb" Oct 06 09:11:54 crc kubenswrapper[4755]: E1006 09:11:54.423883 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb\": container with ID starting with 75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb not found: ID does not exist" containerID="75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb" Oct 06 09:11:54 crc 
kubenswrapper[4755]: I1006 09:11:54.423921 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb"} err="failed to get container status \"75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb\": rpc error: code = NotFound desc = could not find container \"75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb\": container with ID starting with 75294bf94e0165e859bec14b230623e8d87bdd360e4e162512e4c289dbe7d7bb not found: ID does not exist" Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.537849 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77dcf6c7d-x7gnh"] Oct 06 09:11:54 crc kubenswrapper[4755]: I1006 09:11:54.569266 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-77dcf6c7d-x7gnh"] Oct 06 09:11:55 crc kubenswrapper[4755]: I1006 09:11:55.898992 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44e61052-105b-4bd0-8056-8a29dec9fcfe" path="/var/lib/kubelet/pods/44e61052-105b-4bd0-8056-8a29dec9fcfe/volumes" Oct 06 09:12:02 crc kubenswrapper[4755]: I1006 09:12:02.846588 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 06 09:12:04 crc kubenswrapper[4755]: I1006 09:12:04.116555 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 06 09:12:05 crc kubenswrapper[4755]: I1006 09:12:05.246260 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 09:13:02 crc kubenswrapper[4755]: I1006 09:13:02.917360 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kln2n"] Oct 06 09:13:02 crc kubenswrapper[4755]: E1006 09:13:02.919078 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e61052-105b-4bd0-8056-8a29dec9fcfe" 
containerName="horizon-log" Oct 06 09:13:02 crc kubenswrapper[4755]: I1006 09:13:02.919111 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e61052-105b-4bd0-8056-8a29dec9fcfe" containerName="horizon-log" Oct 06 09:13:02 crc kubenswrapper[4755]: E1006 09:13:02.919212 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e61052-105b-4bd0-8056-8a29dec9fcfe" containerName="horizon" Oct 06 09:13:02 crc kubenswrapper[4755]: I1006 09:13:02.919228 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e61052-105b-4bd0-8056-8a29dec9fcfe" containerName="horizon" Oct 06 09:13:02 crc kubenswrapper[4755]: I1006 09:13:02.919643 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="44e61052-105b-4bd0-8056-8a29dec9fcfe" containerName="horizon-log" Oct 06 09:13:02 crc kubenswrapper[4755]: I1006 09:13:02.919672 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="44e61052-105b-4bd0-8056-8a29dec9fcfe" containerName="horizon" Oct 06 09:13:02 crc kubenswrapper[4755]: I1006 09:13:02.922380 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:02 crc kubenswrapper[4755]: I1006 09:13:02.933732 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kln2n"] Oct 06 09:13:03 crc kubenswrapper[4755]: I1006 09:13:03.019811 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-utilities\") pod \"redhat-operators-kln2n\" (UID: \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\") " pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:03 crc kubenswrapper[4755]: I1006 09:13:03.020432 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-catalog-content\") pod \"redhat-operators-kln2n\" (UID: \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\") " pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:03 crc kubenswrapper[4755]: I1006 09:13:03.020515 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hbpp\" (UniqueName: \"kubernetes.io/projected/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-kube-api-access-8hbpp\") pod \"redhat-operators-kln2n\" (UID: \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\") " pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:03 crc kubenswrapper[4755]: I1006 09:13:03.123016 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-utilities\") pod \"redhat-operators-kln2n\" (UID: \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\") " pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:03 crc kubenswrapper[4755]: I1006 09:13:03.123100 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-catalog-content\") pod \"redhat-operators-kln2n\" (UID: \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\") " pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:03 crc kubenswrapper[4755]: I1006 09:13:03.123154 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hbpp\" (UniqueName: \"kubernetes.io/projected/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-kube-api-access-8hbpp\") pod \"redhat-operators-kln2n\" (UID: \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\") " pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:03 crc kubenswrapper[4755]: I1006 09:13:03.123469 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-utilities\") pod \"redhat-operators-kln2n\" (UID: \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\") " pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:03 crc kubenswrapper[4755]: I1006 09:13:03.123660 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-catalog-content\") pod \"redhat-operators-kln2n\" (UID: \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\") " pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:03 crc kubenswrapper[4755]: I1006 09:13:03.142698 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hbpp\" (UniqueName: \"kubernetes.io/projected/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-kube-api-access-8hbpp\") pod \"redhat-operators-kln2n\" (UID: \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\") " pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:03 crc kubenswrapper[4755]: I1006 09:13:03.254820 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:03 crc kubenswrapper[4755]: I1006 09:13:03.732513 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kln2n"] Oct 06 09:13:03 crc kubenswrapper[4755]: I1006 09:13:03.920181 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kln2n" event={"ID":"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368","Type":"ContainerStarted","Data":"8b86922e2809b8cbf35919fa0f687ff1f0f55ea56a58b5cf3c15edf366921369"} Oct 06 09:13:04 crc kubenswrapper[4755]: I1006 09:13:04.938246 4755 generic.go:334] "Generic (PLEG): container finished" podID="71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" containerID="49922217feae799d1eed2c3953e3051a0f87d37cf0a245205d8988da0c9ee0f4" exitCode=0 Oct 06 09:13:04 crc kubenswrapper[4755]: I1006 09:13:04.938473 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kln2n" event={"ID":"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368","Type":"ContainerDied","Data":"49922217feae799d1eed2c3953e3051a0f87d37cf0a245205d8988da0c9ee0f4"} Oct 06 09:13:04 crc kubenswrapper[4755]: I1006 09:13:04.945502 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:13:05 crc kubenswrapper[4755]: I1006 09:13:05.950370 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kln2n" event={"ID":"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368","Type":"ContainerStarted","Data":"f68278759c6ffb8dc32bb54b9b7be86632bab1950258983dc8c099d76432121b"} Oct 06 09:13:06 crc kubenswrapper[4755]: I1006 09:13:06.968283 4755 generic.go:334] "Generic (PLEG): container finished" podID="71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" containerID="f68278759c6ffb8dc32bb54b9b7be86632bab1950258983dc8c099d76432121b" exitCode=0 Oct 06 09:13:06 crc kubenswrapper[4755]: I1006 09:13:06.968388 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-kln2n" event={"ID":"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368","Type":"ContainerDied","Data":"f68278759c6ffb8dc32bb54b9b7be86632bab1950258983dc8c099d76432121b"} Oct 06 09:13:07 crc kubenswrapper[4755]: I1006 09:13:07.981830 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kln2n" event={"ID":"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368","Type":"ContainerStarted","Data":"f901d50affca6ab84beb66696018d2fda6f78f21d107a32ed2a94192ae5d19d8"} Oct 06 09:13:08 crc kubenswrapper[4755]: I1006 09:13:08.004751 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kln2n" podStartSLOduration=3.376827693 podStartE2EDuration="6.004730669s" podCreationTimestamp="2025-10-06 09:13:02 +0000 UTC" firstStartedPulling="2025-10-06 09:13:04.945019523 +0000 UTC m=+3041.774334777" lastFinishedPulling="2025-10-06 09:13:07.572922539 +0000 UTC m=+3044.402237753" observedRunningTime="2025-10-06 09:13:08.002543857 +0000 UTC m=+3044.831859091" watchObservedRunningTime="2025-10-06 09:13:08.004730669 +0000 UTC m=+3044.834045883" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.545849 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.547428 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.549351 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.549498 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-twssm" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.549760 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.550982 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.565046 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.709901 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d5b996de-cff3-4a46-bfd0-25e7833f58e8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.709971 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.710059 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/d5b996de-cff3-4a46-bfd0-25e7833f58e8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.710155 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5b996de-cff3-4a46-bfd0-25e7833f58e8-config-data\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.710244 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d5b996de-cff3-4a46-bfd0-25e7833f58e8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.710276 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.710350 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.710420 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jqmx\" (UniqueName: 
\"kubernetes.io/projected/d5b996de-cff3-4a46-bfd0-25e7833f58e8-kube-api-access-4jqmx\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.710508 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.812437 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d5b996de-cff3-4a46-bfd0-25e7833f58e8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.812494 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.812550 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.812597 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jqmx\" (UniqueName: \"kubernetes.io/projected/d5b996de-cff3-4a46-bfd0-25e7833f58e8-kube-api-access-4jqmx\") pod 
\"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.812644 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.812696 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d5b996de-cff3-4a46-bfd0-25e7833f58e8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.812715 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.812742 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d5b996de-cff3-4a46-bfd0-25e7833f58e8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.812781 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5b996de-cff3-4a46-bfd0-25e7833f58e8-config-data\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " 
pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.813405 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.813589 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d5b996de-cff3-4a46-bfd0-25e7833f58e8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.816022 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5b996de-cff3-4a46-bfd0-25e7833f58e8-config-data\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.816959 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d5b996de-cff3-4a46-bfd0-25e7833f58e8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.818281 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d5b996de-cff3-4a46-bfd0-25e7833f58e8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc 
kubenswrapper[4755]: I1006 09:13:11.821446 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.827156 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.832597 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.833479 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jqmx\" (UniqueName: \"kubernetes.io/projected/d5b996de-cff3-4a46-bfd0-25e7833f58e8-kube-api-access-4jqmx\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.857099 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " pod="openstack/tempest-tests-tempest" Oct 06 09:13:11 crc kubenswrapper[4755]: I1006 09:13:11.873897 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 09:13:12 crc kubenswrapper[4755]: I1006 09:13:12.330036 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 06 09:13:13 crc kubenswrapper[4755]: I1006 09:13:13.027265 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d5b996de-cff3-4a46-bfd0-25e7833f58e8","Type":"ContainerStarted","Data":"a348eca4ce0eee37520be800016196aac2e5be8a2b911dab3b9e3479bd657a7d"} Oct 06 09:13:13 crc kubenswrapper[4755]: I1006 09:13:13.256052 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:13 crc kubenswrapper[4755]: I1006 09:13:13.256408 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:13 crc kubenswrapper[4755]: I1006 09:13:13.310867 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:14 crc kubenswrapper[4755]: I1006 09:13:14.091408 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:14 crc kubenswrapper[4755]: I1006 09:13:14.143229 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kln2n"] Oct 06 09:13:16 crc kubenswrapper[4755]: I1006 09:13:16.058370 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kln2n" podUID="71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" containerName="registry-server" containerID="cri-o://f901d50affca6ab84beb66696018d2fda6f78f21d107a32ed2a94192ae5d19d8" gracePeriod=2 Oct 06 09:13:17 crc kubenswrapper[4755]: I1006 09:13:17.080205 4755 generic.go:334] "Generic (PLEG): container finished" podID="71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" 
containerID="f901d50affca6ab84beb66696018d2fda6f78f21d107a32ed2a94192ae5d19d8" exitCode=0 Oct 06 09:13:17 crc kubenswrapper[4755]: I1006 09:13:17.080293 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kln2n" event={"ID":"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368","Type":"ContainerDied","Data":"f901d50affca6ab84beb66696018d2fda6f78f21d107a32ed2a94192ae5d19d8"} Oct 06 09:13:17 crc kubenswrapper[4755]: I1006 09:13:17.870606 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:17 crc kubenswrapper[4755]: I1006 09:13:17.944333 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-utilities\") pod \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\" (UID: \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\") " Oct 06 09:13:17 crc kubenswrapper[4755]: I1006 09:13:17.944601 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-catalog-content\") pod \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\" (UID: \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\") " Oct 06 09:13:17 crc kubenswrapper[4755]: I1006 09:13:17.944716 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hbpp\" (UniqueName: \"kubernetes.io/projected/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-kube-api-access-8hbpp\") pod \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\" (UID: \"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368\") " Oct 06 09:13:17 crc kubenswrapper[4755]: I1006 09:13:17.945177 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-utilities" (OuterVolumeSpecName: "utilities") pod "71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" (UID: 
"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:13:17 crc kubenswrapper[4755]: I1006 09:13:17.945380 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:13:17 crc kubenswrapper[4755]: I1006 09:13:17.950375 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-kube-api-access-8hbpp" (OuterVolumeSpecName: "kube-api-access-8hbpp") pod "71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" (UID: "71ecc980-b3d4-4dfe-bc49-73eaf5bf4368"). InnerVolumeSpecName "kube-api-access-8hbpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:13:18 crc kubenswrapper[4755]: I1006 09:13:18.008680 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" (UID: "71ecc980-b3d4-4dfe-bc49-73eaf5bf4368"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:13:18 crc kubenswrapper[4755]: I1006 09:13:18.047766 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hbpp\" (UniqueName: \"kubernetes.io/projected/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-kube-api-access-8hbpp\") on node \"crc\" DevicePath \"\"" Oct 06 09:13:18 crc kubenswrapper[4755]: I1006 09:13:18.047819 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:13:18 crc kubenswrapper[4755]: I1006 09:13:18.091134 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kln2n" event={"ID":"71ecc980-b3d4-4dfe-bc49-73eaf5bf4368","Type":"ContainerDied","Data":"8b86922e2809b8cbf35919fa0f687ff1f0f55ea56a58b5cf3c15edf366921369"} Oct 06 09:13:18 crc kubenswrapper[4755]: I1006 09:13:18.091200 4755 scope.go:117] "RemoveContainer" containerID="f901d50affca6ab84beb66696018d2fda6f78f21d107a32ed2a94192ae5d19d8" Oct 06 09:13:18 crc kubenswrapper[4755]: I1006 09:13:18.092339 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kln2n" Oct 06 09:13:18 crc kubenswrapper[4755]: I1006 09:13:18.122697 4755 scope.go:117] "RemoveContainer" containerID="f68278759c6ffb8dc32bb54b9b7be86632bab1950258983dc8c099d76432121b" Oct 06 09:13:18 crc kubenswrapper[4755]: I1006 09:13:18.125366 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kln2n"] Oct 06 09:13:18 crc kubenswrapper[4755]: I1006 09:13:18.133841 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kln2n"] Oct 06 09:13:18 crc kubenswrapper[4755]: I1006 09:13:18.151618 4755 scope.go:117] "RemoveContainer" containerID="49922217feae799d1eed2c3953e3051a0f87d37cf0a245205d8988da0c9ee0f4" Oct 06 09:13:18 crc kubenswrapper[4755]: I1006 09:13:18.912272 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:13:18 crc kubenswrapper[4755]: I1006 09:13:18.912380 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:13:19 crc kubenswrapper[4755]: I1006 09:13:19.892672 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" path="/var/lib/kubelet/pods/71ecc980-b3d4-4dfe-bc49-73eaf5bf4368/volumes" Oct 06 09:13:41 crc kubenswrapper[4755]: E1006 09:13:41.978624 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 06 09:13:41 crc kubenswrapper[4755]: E1006 09:13:41.980769 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},V
olumeMount{Name:kube-api-access-4jqmx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(d5b996de-cff3-4a46-bfd0-25e7833f58e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 09:13:41 crc kubenswrapper[4755]: E1006 09:13:41.982216 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="d5b996de-cff3-4a46-bfd0-25e7833f58e8" Oct 06 09:13:42 crc kubenswrapper[4755]: E1006 09:13:42.353858 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="d5b996de-cff3-4a46-bfd0-25e7833f58e8" Oct 06 09:13:48 crc kubenswrapper[4755]: I1006 09:13:48.912165 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:13:48 crc kubenswrapper[4755]: I1006 09:13:48.912754 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:13:56 crc kubenswrapper[4755]: I1006 09:13:56.332012 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 06 09:13:57 crc kubenswrapper[4755]: I1006 09:13:57.498431 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d5b996de-cff3-4a46-bfd0-25e7833f58e8","Type":"ContainerStarted","Data":"de8fd9e6a3d31077d285d1c560ab461055c4d19edb2a5a64dee6f19ce074861a"} Oct 06 09:13:57 crc kubenswrapper[4755]: I1006 09:13:57.522124 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.527612349 podStartE2EDuration="47.522077571s" podCreationTimestamp="2025-10-06 09:13:10 +0000 UTC" firstStartedPulling="2025-10-06 09:13:12.333623981 +0000 UTC m=+3049.162939195" lastFinishedPulling="2025-10-06 09:13:56.328089203 +0000 UTC m=+3093.157404417" observedRunningTime="2025-10-06 09:13:57.514424365 +0000 UTC m=+3094.343739589" watchObservedRunningTime="2025-10-06 
09:13:57.522077571 +0000 UTC m=+3094.351392785" Oct 06 09:14:18 crc kubenswrapper[4755]: I1006 09:14:18.912331 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:14:18 crc kubenswrapper[4755]: I1006 09:14:18.912949 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:14:18 crc kubenswrapper[4755]: I1006 09:14:18.912995 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 09:14:18 crc kubenswrapper[4755]: I1006 09:14:18.913470 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:14:18 crc kubenswrapper[4755]: I1006 09:14:18.913518 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" gracePeriod=600 Oct 06 09:14:19 crc kubenswrapper[4755]: E1006 09:14:19.049057 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:14:19 crc kubenswrapper[4755]: I1006 09:14:19.731390 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" exitCode=0 Oct 06 09:14:19 crc kubenswrapper[4755]: I1006 09:14:19.731462 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20"} Oct 06 09:14:19 crc kubenswrapper[4755]: I1006 09:14:19.731750 4755 scope.go:117] "RemoveContainer" containerID="192f4452ee1012132588e7317f9d9bfb58ff59e73705bc43b48bde85c4a0e20f" Oct 06 09:14:19 crc kubenswrapper[4755]: I1006 09:14:19.733416 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:14:19 crc kubenswrapper[4755]: E1006 09:14:19.733986 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:14:31 crc kubenswrapper[4755]: I1006 09:14:31.878199 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:14:31 crc kubenswrapper[4755]: 
E1006 09:14:31.879155 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:14:44 crc kubenswrapper[4755]: I1006 09:14:44.878490 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:14:44 crc kubenswrapper[4755]: E1006 09:14:44.879669 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:14:56 crc kubenswrapper[4755]: I1006 09:14:56.878780 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:14:56 crc kubenswrapper[4755]: E1006 09:14:56.879486 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.167531 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t"] Oct 06 
09:15:00 crc kubenswrapper[4755]: E1006 09:15:00.168732 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" containerName="extract-content" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.168752 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" containerName="extract-content" Oct 06 09:15:00 crc kubenswrapper[4755]: E1006 09:15:00.168784 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" containerName="extract-utilities" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.168791 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" containerName="extract-utilities" Oct 06 09:15:00 crc kubenswrapper[4755]: E1006 09:15:00.168815 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" containerName="registry-server" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.168822 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" containerName="registry-server" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.169070 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ecc980-b3d4-4dfe-bc49-73eaf5bf4368" containerName="registry-server" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.169789 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.171872 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.178349 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.190743 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t"] Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.249744 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-config-volume\") pod \"collect-profiles-29329035-sbs7t\" (UID: \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.249802 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-secret-volume\") pod \"collect-profiles-29329035-sbs7t\" (UID: \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.249923 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j22q5\" (UniqueName: \"kubernetes.io/projected/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-kube-api-access-j22q5\") pod \"collect-profiles-29329035-sbs7t\" (UID: \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.351750 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-secret-volume\") pod \"collect-profiles-29329035-sbs7t\" (UID: \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.351855 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j22q5\" (UniqueName: \"kubernetes.io/projected/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-kube-api-access-j22q5\") pod \"collect-profiles-29329035-sbs7t\" (UID: \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.352036 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-config-volume\") pod \"collect-profiles-29329035-sbs7t\" (UID: \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.353914 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-config-volume\") pod \"collect-profiles-29329035-sbs7t\" (UID: \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.360338 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-secret-volume\") pod \"collect-profiles-29329035-sbs7t\" (UID: \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.375197 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j22q5\" (UniqueName: \"kubernetes.io/projected/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-kube-api-access-j22q5\") pod \"collect-profiles-29329035-sbs7t\" (UID: \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.488239 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" Oct 06 09:15:00 crc kubenswrapper[4755]: I1006 09:15:00.925615 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t"] Oct 06 09:15:01 crc kubenswrapper[4755]: I1006 09:15:01.120079 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" event={"ID":"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f","Type":"ContainerStarted","Data":"5d32e81d80502873bef85370fcf5668f081c689bf4fed5c8508697de9a1293fb"} Oct 06 09:15:02 crc kubenswrapper[4755]: I1006 09:15:02.130705 4755 generic.go:334] "Generic (PLEG): container finished" podID="e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f" containerID="84eaf771f7428a5dee67afe8bdc41723543841d54642a1df005a1c9a8ffb22fe" exitCode=0 Oct 06 09:15:02 crc kubenswrapper[4755]: I1006 09:15:02.131035 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" 
event={"ID":"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f","Type":"ContainerDied","Data":"84eaf771f7428a5dee67afe8bdc41723543841d54642a1df005a1c9a8ffb22fe"} Oct 06 09:15:03 crc kubenswrapper[4755]: I1006 09:15:03.547386 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" Oct 06 09:15:03 crc kubenswrapper[4755]: I1006 09:15:03.739020 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j22q5\" (UniqueName: \"kubernetes.io/projected/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-kube-api-access-j22q5\") pod \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\" (UID: \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\") " Oct 06 09:15:03 crc kubenswrapper[4755]: I1006 09:15:03.739279 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-config-volume\") pod \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\" (UID: \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\") " Oct 06 09:15:03 crc kubenswrapper[4755]: I1006 09:15:03.739341 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-secret-volume\") pod \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\" (UID: \"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f\") " Oct 06 09:15:03 crc kubenswrapper[4755]: I1006 09:15:03.740856 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-config-volume" (OuterVolumeSpecName: "config-volume") pod "e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f" (UID: "e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:15:03 crc kubenswrapper[4755]: I1006 09:15:03.746861 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f" (UID: "e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:15:03 crc kubenswrapper[4755]: I1006 09:15:03.747978 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-kube-api-access-j22q5" (OuterVolumeSpecName: "kube-api-access-j22q5") pod "e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f" (UID: "e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f"). InnerVolumeSpecName "kube-api-access-j22q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:15:03 crc kubenswrapper[4755]: I1006 09:15:03.842451 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:03 crc kubenswrapper[4755]: I1006 09:15:03.842493 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:03 crc kubenswrapper[4755]: I1006 09:15:03.842523 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j22q5\" (UniqueName: \"kubernetes.io/projected/e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f-kube-api-access-j22q5\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:04 crc kubenswrapper[4755]: I1006 09:15:04.152009 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" 
event={"ID":"e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f","Type":"ContainerDied","Data":"5d32e81d80502873bef85370fcf5668f081c689bf4fed5c8508697de9a1293fb"} Oct 06 09:15:04 crc kubenswrapper[4755]: I1006 09:15:04.152053 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d32e81d80502873bef85370fcf5668f081c689bf4fed5c8508697de9a1293fb" Oct 06 09:15:04 crc kubenswrapper[4755]: I1006 09:15:04.152082 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-sbs7t" Oct 06 09:15:04 crc kubenswrapper[4755]: I1006 09:15:04.626674 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t"] Oct 06 09:15:04 crc kubenswrapper[4755]: I1006 09:15:04.636944 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328990-nf79t"] Oct 06 09:15:05 crc kubenswrapper[4755]: I1006 09:15:05.898634 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cac01e8-f35b-4518-97e6-358a42a2620d" path="/var/lib/kubelet/pods/8cac01e8-f35b-4518-97e6-358a42a2620d/volumes" Oct 06 09:15:07 crc kubenswrapper[4755]: I1006 09:15:07.878419 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:15:07 crc kubenswrapper[4755]: E1006 09:15:07.879113 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:15:18 crc kubenswrapper[4755]: I1006 09:15:18.879154 4755 scope.go:117] "RemoveContainer" 
containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:15:18 crc kubenswrapper[4755]: E1006 09:15:18.879917 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:15:33 crc kubenswrapper[4755]: I1006 09:15:33.886424 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:15:33 crc kubenswrapper[4755]: E1006 09:15:33.888390 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:15:41 crc kubenswrapper[4755]: I1006 09:15:41.945841 4755 scope.go:117] "RemoveContainer" containerID="a8078a8b193b4df14ea04badc5890279fac83331bdd5c79aedf7387d2e60bbb5" Oct 06 09:15:44 crc kubenswrapper[4755]: I1006 09:15:44.878971 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:15:44 crc kubenswrapper[4755]: E1006 09:15:44.881067 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:15:56 crc kubenswrapper[4755]: I1006 09:15:56.879164 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:15:56 crc kubenswrapper[4755]: E1006 09:15:56.879869 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:16:10 crc kubenswrapper[4755]: I1006 09:16:10.878993 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:16:10 crc kubenswrapper[4755]: E1006 09:16:10.882959 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:16:24 crc kubenswrapper[4755]: I1006 09:16:24.879098 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:16:24 crc kubenswrapper[4755]: E1006 09:16:24.880028 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:16:39 crc kubenswrapper[4755]: I1006 09:16:39.879119 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:16:39 crc kubenswrapper[4755]: E1006 09:16:39.880071 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:16:52 crc kubenswrapper[4755]: I1006 09:16:52.879785 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:16:52 crc kubenswrapper[4755]: E1006 09:16:52.880540 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:17:05 crc kubenswrapper[4755]: I1006 09:17:05.878969 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:17:05 crc kubenswrapper[4755]: E1006 09:17:05.879778 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:17:20 crc kubenswrapper[4755]: I1006 09:17:20.878959 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:17:20 crc kubenswrapper[4755]: E1006 09:17:20.879694 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:17:31 crc kubenswrapper[4755]: I1006 09:17:31.879168 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:17:31 crc kubenswrapper[4755]: E1006 09:17:31.880322 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:17:43 crc kubenswrapper[4755]: I1006 09:17:43.893352 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:17:43 crc kubenswrapper[4755]: E1006 09:17:43.894680 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:17:54 crc kubenswrapper[4755]: I1006 09:17:54.878736 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:17:54 crc kubenswrapper[4755]: E1006 09:17:54.879638 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:18:05 crc kubenswrapper[4755]: I1006 09:18:05.879710 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:18:05 crc kubenswrapper[4755]: E1006 09:18:05.880698 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:18:20 crc kubenswrapper[4755]: I1006 09:18:20.879451 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:18:20 crc kubenswrapper[4755]: E1006 09:18:20.880415 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:18:34 crc kubenswrapper[4755]: I1006 09:18:34.879167 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:18:34 crc kubenswrapper[4755]: E1006 09:18:34.880159 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:18:45 crc kubenswrapper[4755]: I1006 09:18:45.879770 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:18:45 crc kubenswrapper[4755]: E1006 09:18:45.880773 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:19:00 crc kubenswrapper[4755]: I1006 09:19:00.879126 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:19:00 crc kubenswrapper[4755]: E1006 09:19:00.879943 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:19:14 crc kubenswrapper[4755]: I1006 09:19:14.878971 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:19:14 crc kubenswrapper[4755]: E1006 09:19:14.880372 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:19:26 crc kubenswrapper[4755]: I1006 09:19:26.879313 4755 scope.go:117] "RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:19:27 crc kubenswrapper[4755]: I1006 09:19:27.580903 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"07fcb6bf36260f4366321f1ce8755cfe4cd512cf29960b407d348b3ea9e1e47a"} Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.244981 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rk7nw"] Oct 06 09:20:01 crc kubenswrapper[4755]: E1006 09:20:01.245832 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f" containerName="collect-profiles" Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.245846 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f" containerName="collect-profiles" Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.246079 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e5e2bd-f2bc-4a86-9a85-1ae9887e058f" containerName="collect-profiles" Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.247505 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.256593 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rk7nw"] Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.343050 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d3ab63-5826-49ce-a0d5-9e96b893908a-catalog-content\") pod \"certified-operators-rk7nw\" (UID: \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\") " pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.343106 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d3ab63-5826-49ce-a0d5-9e96b893908a-utilities\") pod \"certified-operators-rk7nw\" (UID: \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\") " pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.343215 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qv6h\" (UniqueName: \"kubernetes.io/projected/e1d3ab63-5826-49ce-a0d5-9e96b893908a-kube-api-access-5qv6h\") pod \"certified-operators-rk7nw\" (UID: \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\") " pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.444895 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5qv6h\" (UniqueName: \"kubernetes.io/projected/e1d3ab63-5826-49ce-a0d5-9e96b893908a-kube-api-access-5qv6h\") pod \"certified-operators-rk7nw\" (UID: \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\") " pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.445029 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d3ab63-5826-49ce-a0d5-9e96b893908a-catalog-content\") pod \"certified-operators-rk7nw\" (UID: \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\") " pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.445052 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d3ab63-5826-49ce-a0d5-9e96b893908a-utilities\") pod \"certified-operators-rk7nw\" (UID: \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\") " pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.445637 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d3ab63-5826-49ce-a0d5-9e96b893908a-catalog-content\") pod \"certified-operators-rk7nw\" (UID: \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\") " pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.445665 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d3ab63-5826-49ce-a0d5-9e96b893908a-utilities\") pod \"certified-operators-rk7nw\" (UID: \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\") " pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.468215 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5qv6h\" (UniqueName: \"kubernetes.io/projected/e1d3ab63-5826-49ce-a0d5-9e96b893908a-kube-api-access-5qv6h\") pod \"certified-operators-rk7nw\" (UID: \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\") " pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:01 crc kubenswrapper[4755]: I1006 09:20:01.574288 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:02 crc kubenswrapper[4755]: I1006 09:20:02.170531 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rk7nw"] Oct 06 09:20:02 crc kubenswrapper[4755]: I1006 09:20:02.898528 4755 generic.go:334] "Generic (PLEG): container finished" podID="e1d3ab63-5826-49ce-a0d5-9e96b893908a" containerID="5a7d029659d96153f3d0b7666c7dfb545d35646ba2fed1bcf88972a366446e34" exitCode=0 Oct 06 09:20:02 crc kubenswrapper[4755]: I1006 09:20:02.898755 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk7nw" event={"ID":"e1d3ab63-5826-49ce-a0d5-9e96b893908a","Type":"ContainerDied","Data":"5a7d029659d96153f3d0b7666c7dfb545d35646ba2fed1bcf88972a366446e34"} Oct 06 09:20:02 crc kubenswrapper[4755]: I1006 09:20:02.902316 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:20:02 crc kubenswrapper[4755]: I1006 09:20:02.903481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk7nw" event={"ID":"e1d3ab63-5826-49ce-a0d5-9e96b893908a","Type":"ContainerStarted","Data":"08e94c4a0b71659c7d3877a0429b8d212c231f45e79eae99eb4214c68d85aefd"} Oct 06 09:20:03 crc kubenswrapper[4755]: I1006 09:20:03.914785 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk7nw" 
event={"ID":"e1d3ab63-5826-49ce-a0d5-9e96b893908a","Type":"ContainerStarted","Data":"a588806db31e75a8974e4850c54c2bb25b74558f90c4369716399806adb1b5b6"} Oct 06 09:20:04 crc kubenswrapper[4755]: I1006 09:20:04.925839 4755 generic.go:334] "Generic (PLEG): container finished" podID="e1d3ab63-5826-49ce-a0d5-9e96b893908a" containerID="a588806db31e75a8974e4850c54c2bb25b74558f90c4369716399806adb1b5b6" exitCode=0 Oct 06 09:20:04 crc kubenswrapper[4755]: I1006 09:20:04.926211 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk7nw" event={"ID":"e1d3ab63-5826-49ce-a0d5-9e96b893908a","Type":"ContainerDied","Data":"a588806db31e75a8974e4850c54c2bb25b74558f90c4369716399806adb1b5b6"} Oct 06 09:20:05 crc kubenswrapper[4755]: I1006 09:20:05.936204 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk7nw" event={"ID":"e1d3ab63-5826-49ce-a0d5-9e96b893908a","Type":"ContainerStarted","Data":"4d0a9bc784a6156c0c6c5b94a87302d0ec6e0ee1fff6f4a104296f93416ae10d"} Oct 06 09:20:05 crc kubenswrapper[4755]: I1006 09:20:05.955632 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rk7nw" podStartSLOduration=2.251159923 podStartE2EDuration="4.955611838s" podCreationTimestamp="2025-10-06 09:20:01 +0000 UTC" firstStartedPulling="2025-10-06 09:20:02.901757484 +0000 UTC m=+3459.731072708" lastFinishedPulling="2025-10-06 09:20:05.606209389 +0000 UTC m=+3462.435524623" observedRunningTime="2025-10-06 09:20:05.953022365 +0000 UTC m=+3462.782337589" watchObservedRunningTime="2025-10-06 09:20:05.955611838 +0000 UTC m=+3462.784927052" Oct 06 09:20:11 crc kubenswrapper[4755]: I1006 09:20:11.575298 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:11 crc kubenswrapper[4755]: I1006 09:20:11.575922 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:11 crc kubenswrapper[4755]: I1006 09:20:11.624965 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:12 crc kubenswrapper[4755]: I1006 09:20:12.035334 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:12 crc kubenswrapper[4755]: I1006 09:20:12.088490 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rk7nw"] Oct 06 09:20:14 crc kubenswrapper[4755]: I1006 09:20:14.015949 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rk7nw" podUID="e1d3ab63-5826-49ce-a0d5-9e96b893908a" containerName="registry-server" containerID="cri-o://4d0a9bc784a6156c0c6c5b94a87302d0ec6e0ee1fff6f4a104296f93416ae10d" gracePeriod=2 Oct 06 09:20:14 crc kubenswrapper[4755]: I1006 09:20:14.608848 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:14 crc kubenswrapper[4755]: I1006 09:20:14.721597 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d3ab63-5826-49ce-a0d5-9e96b893908a-utilities\") pod \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\" (UID: \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\") " Oct 06 09:20:14 crc kubenswrapper[4755]: I1006 09:20:14.721871 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qv6h\" (UniqueName: \"kubernetes.io/projected/e1d3ab63-5826-49ce-a0d5-9e96b893908a-kube-api-access-5qv6h\") pod \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\" (UID: \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\") " Oct 06 09:20:14 crc kubenswrapper[4755]: I1006 09:20:14.722168 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d3ab63-5826-49ce-a0d5-9e96b893908a-catalog-content\") pod \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\" (UID: \"e1d3ab63-5826-49ce-a0d5-9e96b893908a\") " Oct 06 09:20:14 crc kubenswrapper[4755]: I1006 09:20:14.722580 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1d3ab63-5826-49ce-a0d5-9e96b893908a-utilities" (OuterVolumeSpecName: "utilities") pod "e1d3ab63-5826-49ce-a0d5-9e96b893908a" (UID: "e1d3ab63-5826-49ce-a0d5-9e96b893908a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:20:14 crc kubenswrapper[4755]: I1006 09:20:14.722841 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d3ab63-5826-49ce-a0d5-9e96b893908a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:20:14 crc kubenswrapper[4755]: I1006 09:20:14.730524 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d3ab63-5826-49ce-a0d5-9e96b893908a-kube-api-access-5qv6h" (OuterVolumeSpecName: "kube-api-access-5qv6h") pod "e1d3ab63-5826-49ce-a0d5-9e96b893908a" (UID: "e1d3ab63-5826-49ce-a0d5-9e96b893908a"). InnerVolumeSpecName "kube-api-access-5qv6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:20:14 crc kubenswrapper[4755]: I1006 09:20:14.825756 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qv6h\" (UniqueName: \"kubernetes.io/projected/e1d3ab63-5826-49ce-a0d5-9e96b893908a-kube-api-access-5qv6h\") on node \"crc\" DevicePath \"\"" Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.027000 4755 generic.go:334] "Generic (PLEG): container finished" podID="e1d3ab63-5826-49ce-a0d5-9e96b893908a" containerID="4d0a9bc784a6156c0c6c5b94a87302d0ec6e0ee1fff6f4a104296f93416ae10d" exitCode=0 Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.027045 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk7nw" event={"ID":"e1d3ab63-5826-49ce-a0d5-9e96b893908a","Type":"ContainerDied","Data":"4d0a9bc784a6156c0c6c5b94a87302d0ec6e0ee1fff6f4a104296f93416ae10d"} Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.027085 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk7nw" event={"ID":"e1d3ab63-5826-49ce-a0d5-9e96b893908a","Type":"ContainerDied","Data":"08e94c4a0b71659c7d3877a0429b8d212c231f45e79eae99eb4214c68d85aefd"} Oct 06 09:20:15 crc kubenswrapper[4755]: 
I1006 09:20:15.027092 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rk7nw" Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.027116 4755 scope.go:117] "RemoveContainer" containerID="4d0a9bc784a6156c0c6c5b94a87302d0ec6e0ee1fff6f4a104296f93416ae10d" Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.050252 4755 scope.go:117] "RemoveContainer" containerID="a588806db31e75a8974e4850c54c2bb25b74558f90c4369716399806adb1b5b6" Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.075996 4755 scope.go:117] "RemoveContainer" containerID="5a7d029659d96153f3d0b7666c7dfb545d35646ba2fed1bcf88972a366446e34" Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.109555 4755 scope.go:117] "RemoveContainer" containerID="4d0a9bc784a6156c0c6c5b94a87302d0ec6e0ee1fff6f4a104296f93416ae10d" Oct 06 09:20:15 crc kubenswrapper[4755]: E1006 09:20:15.109976 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0a9bc784a6156c0c6c5b94a87302d0ec6e0ee1fff6f4a104296f93416ae10d\": container with ID starting with 4d0a9bc784a6156c0c6c5b94a87302d0ec6e0ee1fff6f4a104296f93416ae10d not found: ID does not exist" containerID="4d0a9bc784a6156c0c6c5b94a87302d0ec6e0ee1fff6f4a104296f93416ae10d" Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.110008 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0a9bc784a6156c0c6c5b94a87302d0ec6e0ee1fff6f4a104296f93416ae10d"} err="failed to get container status \"4d0a9bc784a6156c0c6c5b94a87302d0ec6e0ee1fff6f4a104296f93416ae10d\": rpc error: code = NotFound desc = could not find container \"4d0a9bc784a6156c0c6c5b94a87302d0ec6e0ee1fff6f4a104296f93416ae10d\": container with ID starting with 4d0a9bc784a6156c0c6c5b94a87302d0ec6e0ee1fff6f4a104296f93416ae10d not found: ID does not exist" Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.110029 4755 
scope.go:117] "RemoveContainer" containerID="a588806db31e75a8974e4850c54c2bb25b74558f90c4369716399806adb1b5b6" Oct 06 09:20:15 crc kubenswrapper[4755]: E1006 09:20:15.110404 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a588806db31e75a8974e4850c54c2bb25b74558f90c4369716399806adb1b5b6\": container with ID starting with a588806db31e75a8974e4850c54c2bb25b74558f90c4369716399806adb1b5b6 not found: ID does not exist" containerID="a588806db31e75a8974e4850c54c2bb25b74558f90c4369716399806adb1b5b6" Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.110437 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a588806db31e75a8974e4850c54c2bb25b74558f90c4369716399806adb1b5b6"} err="failed to get container status \"a588806db31e75a8974e4850c54c2bb25b74558f90c4369716399806adb1b5b6\": rpc error: code = NotFound desc = could not find container \"a588806db31e75a8974e4850c54c2bb25b74558f90c4369716399806adb1b5b6\": container with ID starting with a588806db31e75a8974e4850c54c2bb25b74558f90c4369716399806adb1b5b6 not found: ID does not exist" Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.110456 4755 scope.go:117] "RemoveContainer" containerID="5a7d029659d96153f3d0b7666c7dfb545d35646ba2fed1bcf88972a366446e34" Oct 06 09:20:15 crc kubenswrapper[4755]: E1006 09:20:15.110774 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a7d029659d96153f3d0b7666c7dfb545d35646ba2fed1bcf88972a366446e34\": container with ID starting with 5a7d029659d96153f3d0b7666c7dfb545d35646ba2fed1bcf88972a366446e34 not found: ID does not exist" containerID="5a7d029659d96153f3d0b7666c7dfb545d35646ba2fed1bcf88972a366446e34" Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.110820 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5a7d029659d96153f3d0b7666c7dfb545d35646ba2fed1bcf88972a366446e34"} err="failed to get container status \"5a7d029659d96153f3d0b7666c7dfb545d35646ba2fed1bcf88972a366446e34\": rpc error: code = NotFound desc = could not find container \"5a7d029659d96153f3d0b7666c7dfb545d35646ba2fed1bcf88972a366446e34\": container with ID starting with 5a7d029659d96153f3d0b7666c7dfb545d35646ba2fed1bcf88972a366446e34 not found: ID does not exist" Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.393217 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1d3ab63-5826-49ce-a0d5-9e96b893908a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1d3ab63-5826-49ce-a0d5-9e96b893908a" (UID: "e1d3ab63-5826-49ce-a0d5-9e96b893908a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.436456 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d3ab63-5826-49ce-a0d5-9e96b893908a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.656216 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rk7nw"] Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.665359 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rk7nw"] Oct 06 09:20:15 crc kubenswrapper[4755]: I1006 09:20:15.891771 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d3ab63-5826-49ce-a0d5-9e96b893908a" path="/var/lib/kubelet/pods/e1d3ab63-5826-49ce-a0d5-9e96b893908a/volumes" Oct 06 09:20:39 crc kubenswrapper[4755]: I1006 09:20:39.043596 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-r74gz"] Oct 06 09:20:39 crc kubenswrapper[4755]: I1006 09:20:39.052420 4755 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-r74gz"] Oct 06 09:20:39 crc kubenswrapper[4755]: I1006 09:20:39.890956 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95bc5e36-48f0-46d8-a2d6-ef94e52c7b96" path="/var/lib/kubelet/pods/95bc5e36-48f0-46d8-a2d6-ef94e52c7b96/volumes" Oct 06 09:20:42 crc kubenswrapper[4755]: I1006 09:20:42.148714 4755 scope.go:117] "RemoveContainer" containerID="e778b1f2d5ce4896aa5c58cc165202c69b8f1b127620513830d5195477c025be" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.825196 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-69kbs"] Oct 06 09:20:52 crc kubenswrapper[4755]: E1006 09:20:52.826312 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d3ab63-5826-49ce-a0d5-9e96b893908a" containerName="extract-content" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.826332 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d3ab63-5826-49ce-a0d5-9e96b893908a" containerName="extract-content" Oct 06 09:20:52 crc kubenswrapper[4755]: E1006 09:20:52.826371 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d3ab63-5826-49ce-a0d5-9e96b893908a" containerName="extract-utilities" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.826380 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d3ab63-5826-49ce-a0d5-9e96b893908a" containerName="extract-utilities" Oct 06 09:20:52 crc kubenswrapper[4755]: E1006 09:20:52.826392 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d3ab63-5826-49ce-a0d5-9e96b893908a" containerName="registry-server" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.826400 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d3ab63-5826-49ce-a0d5-9e96b893908a" containerName="registry-server" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.826648 4755 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e1d3ab63-5826-49ce-a0d5-9e96b893908a" containerName="registry-server" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.828652 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.837875 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69kbs"] Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.894654 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-catalog-content\") pod \"community-operators-69kbs\" (UID: \"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\") " pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.894727 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6cz6\" (UniqueName: \"kubernetes.io/projected/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-kube-api-access-w6cz6\") pod \"community-operators-69kbs\" (UID: \"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\") " pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.895242 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-utilities\") pod \"community-operators-69kbs\" (UID: \"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\") " pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.997388 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-catalog-content\") pod 
\"community-operators-69kbs\" (UID: \"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\") " pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.997488 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6cz6\" (UniqueName: \"kubernetes.io/projected/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-kube-api-access-w6cz6\") pod \"community-operators-69kbs\" (UID: \"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\") " pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.997697 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-utilities\") pod \"community-operators-69kbs\" (UID: \"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\") " pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.998787 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-utilities\") pod \"community-operators-69kbs\" (UID: \"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\") " pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:20:52 crc kubenswrapper[4755]: I1006 09:20:52.999159 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-catalog-content\") pod \"community-operators-69kbs\" (UID: \"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\") " pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:20:53 crc kubenswrapper[4755]: I1006 09:20:53.022464 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6cz6\" (UniqueName: \"kubernetes.io/projected/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-kube-api-access-w6cz6\") pod \"community-operators-69kbs\" (UID: 
\"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\") " pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:20:53 crc kubenswrapper[4755]: I1006 09:20:53.058166 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-d472-account-create-rbm5d"] Oct 06 09:20:53 crc kubenswrapper[4755]: I1006 09:20:53.090027 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-d472-account-create-rbm5d"] Oct 06 09:20:53 crc kubenswrapper[4755]: I1006 09:20:53.157282 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:20:53 crc kubenswrapper[4755]: I1006 09:20:53.771840 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69kbs"] Oct 06 09:20:53 crc kubenswrapper[4755]: I1006 09:20:53.892143 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ce0f617-8015-439d-a175-a596684cf8b9" path="/var/lib/kubelet/pods/1ce0f617-8015-439d-a175-a596684cf8b9/volumes" Oct 06 09:20:54 crc kubenswrapper[4755]: I1006 09:20:54.402084 4755 generic.go:334] "Generic (PLEG): container finished" podID="f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" containerID="71671a23b22cbf88d370c5ad5fa73ae4a57ba578d2a85b56be33c4080828ebc4" exitCode=0 Oct 06 09:20:54 crc kubenswrapper[4755]: I1006 09:20:54.402366 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69kbs" event={"ID":"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce","Type":"ContainerDied","Data":"71671a23b22cbf88d370c5ad5fa73ae4a57ba578d2a85b56be33c4080828ebc4"} Oct 06 09:20:54 crc kubenswrapper[4755]: I1006 09:20:54.402390 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69kbs" event={"ID":"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce","Type":"ContainerStarted","Data":"1bed096c28e8f7be2c8a71298b60aceb21736be37929be0a15542c19cc110eba"} Oct 06 09:20:56 crc kubenswrapper[4755]: I1006 
09:20:56.443162 4755 generic.go:334] "Generic (PLEG): container finished" podID="f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" containerID="aec1dcde505fa63f134c1dcd02bfb4ac46548b4c5127bdc83ebd86b98f0e56ef" exitCode=0 Oct 06 09:20:56 crc kubenswrapper[4755]: I1006 09:20:56.443280 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69kbs" event={"ID":"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce","Type":"ContainerDied","Data":"aec1dcde505fa63f134c1dcd02bfb4ac46548b4c5127bdc83ebd86b98f0e56ef"} Oct 06 09:20:58 crc kubenswrapper[4755]: I1006 09:20:58.468178 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69kbs" event={"ID":"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce","Type":"ContainerStarted","Data":"2faaef9680ffc40a5938b7b3f025d5e35f01a4b8a8dd80dc98322d7db9dd1b6c"} Oct 06 09:20:58 crc kubenswrapper[4755]: I1006 09:20:58.490689 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-69kbs" podStartSLOduration=3.466198658 podStartE2EDuration="6.490674908s" podCreationTimestamp="2025-10-06 09:20:52 +0000 UTC" firstStartedPulling="2025-10-06 09:20:54.40477408 +0000 UTC m=+3511.234089294" lastFinishedPulling="2025-10-06 09:20:57.42925033 +0000 UTC m=+3514.258565544" observedRunningTime="2025-10-06 09:20:58.488985477 +0000 UTC m=+3515.318300691" watchObservedRunningTime="2025-10-06 09:20:58.490674908 +0000 UTC m=+3515.319990122" Oct 06 09:21:03 crc kubenswrapper[4755]: I1006 09:21:03.157983 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:21:03 crc kubenswrapper[4755]: I1006 09:21:03.159080 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:21:03 crc kubenswrapper[4755]: I1006 09:21:03.212789 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:21:03 crc kubenswrapper[4755]: I1006 09:21:03.576581 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:21:03 crc kubenswrapper[4755]: I1006 09:21:03.634824 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-69kbs"] Oct 06 09:21:05 crc kubenswrapper[4755]: I1006 09:21:05.545039 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-69kbs" podUID="f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" containerName="registry-server" containerID="cri-o://2faaef9680ffc40a5938b7b3f025d5e35f01a4b8a8dd80dc98322d7db9dd1b6c" gracePeriod=2 Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.218141 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.394010 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-utilities\") pod \"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\" (UID: \"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\") " Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.394135 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-catalog-content\") pod \"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\" (UID: \"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\") " Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.394406 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6cz6\" (UniqueName: \"kubernetes.io/projected/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-kube-api-access-w6cz6\") pod 
\"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\" (UID: \"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce\") " Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.394911 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-utilities" (OuterVolumeSpecName: "utilities") pod "f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" (UID: "f50b38da-2a3e-4475-9efa-2b94c0a7d8ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.395591 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.400261 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-kube-api-access-w6cz6" (OuterVolumeSpecName: "kube-api-access-w6cz6") pod "f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" (UID: "f50b38da-2a3e-4475-9efa-2b94c0a7d8ce"). InnerVolumeSpecName "kube-api-access-w6cz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.442494 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" (UID: "f50b38da-2a3e-4475-9efa-2b94c0a7d8ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.497515 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6cz6\" (UniqueName: \"kubernetes.io/projected/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-kube-api-access-w6cz6\") on node \"crc\" DevicePath \"\"" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.497861 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.557112 4755 generic.go:334] "Generic (PLEG): container finished" podID="f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" containerID="2faaef9680ffc40a5938b7b3f025d5e35f01a4b8a8dd80dc98322d7db9dd1b6c" exitCode=0 Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.557168 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69kbs" event={"ID":"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce","Type":"ContainerDied","Data":"2faaef9680ffc40a5938b7b3f025d5e35f01a4b8a8dd80dc98322d7db9dd1b6c"} Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.557211 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69kbs" event={"ID":"f50b38da-2a3e-4475-9efa-2b94c0a7d8ce","Type":"ContainerDied","Data":"1bed096c28e8f7be2c8a71298b60aceb21736be37929be0a15542c19cc110eba"} Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.557237 4755 scope.go:117] "RemoveContainer" containerID="2faaef9680ffc40a5938b7b3f025d5e35f01a4b8a8dd80dc98322d7db9dd1b6c" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.557235 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-69kbs" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.582947 4755 scope.go:117] "RemoveContainer" containerID="aec1dcde505fa63f134c1dcd02bfb4ac46548b4c5127bdc83ebd86b98f0e56ef" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.599178 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-69kbs"] Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.609010 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-69kbs"] Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.628396 4755 scope.go:117] "RemoveContainer" containerID="71671a23b22cbf88d370c5ad5fa73ae4a57ba578d2a85b56be33c4080828ebc4" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.666182 4755 scope.go:117] "RemoveContainer" containerID="2faaef9680ffc40a5938b7b3f025d5e35f01a4b8a8dd80dc98322d7db9dd1b6c" Oct 06 09:21:06 crc kubenswrapper[4755]: E1006 09:21:06.666600 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2faaef9680ffc40a5938b7b3f025d5e35f01a4b8a8dd80dc98322d7db9dd1b6c\": container with ID starting with 2faaef9680ffc40a5938b7b3f025d5e35f01a4b8a8dd80dc98322d7db9dd1b6c not found: ID does not exist" containerID="2faaef9680ffc40a5938b7b3f025d5e35f01a4b8a8dd80dc98322d7db9dd1b6c" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.666633 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2faaef9680ffc40a5938b7b3f025d5e35f01a4b8a8dd80dc98322d7db9dd1b6c"} err="failed to get container status \"2faaef9680ffc40a5938b7b3f025d5e35f01a4b8a8dd80dc98322d7db9dd1b6c\": rpc error: code = NotFound desc = could not find container \"2faaef9680ffc40a5938b7b3f025d5e35f01a4b8a8dd80dc98322d7db9dd1b6c\": container with ID starting with 2faaef9680ffc40a5938b7b3f025d5e35f01a4b8a8dd80dc98322d7db9dd1b6c not 
found: ID does not exist" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.666660 4755 scope.go:117] "RemoveContainer" containerID="aec1dcde505fa63f134c1dcd02bfb4ac46548b4c5127bdc83ebd86b98f0e56ef" Oct 06 09:21:06 crc kubenswrapper[4755]: E1006 09:21:06.667194 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aec1dcde505fa63f134c1dcd02bfb4ac46548b4c5127bdc83ebd86b98f0e56ef\": container with ID starting with aec1dcde505fa63f134c1dcd02bfb4ac46548b4c5127bdc83ebd86b98f0e56ef not found: ID does not exist" containerID="aec1dcde505fa63f134c1dcd02bfb4ac46548b4c5127bdc83ebd86b98f0e56ef" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.667214 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aec1dcde505fa63f134c1dcd02bfb4ac46548b4c5127bdc83ebd86b98f0e56ef"} err="failed to get container status \"aec1dcde505fa63f134c1dcd02bfb4ac46548b4c5127bdc83ebd86b98f0e56ef\": rpc error: code = NotFound desc = could not find container \"aec1dcde505fa63f134c1dcd02bfb4ac46548b4c5127bdc83ebd86b98f0e56ef\": container with ID starting with aec1dcde505fa63f134c1dcd02bfb4ac46548b4c5127bdc83ebd86b98f0e56ef not found: ID does not exist" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.667227 4755 scope.go:117] "RemoveContainer" containerID="71671a23b22cbf88d370c5ad5fa73ae4a57ba578d2a85b56be33c4080828ebc4" Oct 06 09:21:06 crc kubenswrapper[4755]: E1006 09:21:06.667476 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71671a23b22cbf88d370c5ad5fa73ae4a57ba578d2a85b56be33c4080828ebc4\": container with ID starting with 71671a23b22cbf88d370c5ad5fa73ae4a57ba578d2a85b56be33c4080828ebc4 not found: ID does not exist" containerID="71671a23b22cbf88d370c5ad5fa73ae4a57ba578d2a85b56be33c4080828ebc4" Oct 06 09:21:06 crc kubenswrapper[4755]: I1006 09:21:06.667499 4755 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71671a23b22cbf88d370c5ad5fa73ae4a57ba578d2a85b56be33c4080828ebc4"} err="failed to get container status \"71671a23b22cbf88d370c5ad5fa73ae4a57ba578d2a85b56be33c4080828ebc4\": rpc error: code = NotFound desc = could not find container \"71671a23b22cbf88d370c5ad5fa73ae4a57ba578d2a85b56be33c4080828ebc4\": container with ID starting with 71671a23b22cbf88d370c5ad5fa73ae4a57ba578d2a85b56be33c4080828ebc4 not found: ID does not exist" Oct 06 09:21:07 crc kubenswrapper[4755]: I1006 09:21:07.889523 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" path="/var/lib/kubelet/pods/f50b38da-2a3e-4475-9efa-2b94c0a7d8ce/volumes" Oct 06 09:21:15 crc kubenswrapper[4755]: I1006 09:21:15.065386 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-m6zbn"] Oct 06 09:21:15 crc kubenswrapper[4755]: I1006 09:21:15.081259 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-m6zbn"] Oct 06 09:21:15 crc kubenswrapper[4755]: I1006 09:21:15.890275 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64aecb78-16d1-476d-852e-9891c850994e" path="/var/lib/kubelet/pods/64aecb78-16d1-476d-852e-9891c850994e/volumes" Oct 06 09:21:42 crc kubenswrapper[4755]: I1006 09:21:42.247134 4755 scope.go:117] "RemoveContainer" containerID="a75777935df112ed2664c3f92b3389441eb91c509cd3313efc81c0c67d8fdf19" Oct 06 09:21:42 crc kubenswrapper[4755]: I1006 09:21:42.285930 4755 scope.go:117] "RemoveContainer" containerID="86e04bde33c90f3e0e7d0575c2e650677d0948577f4a4f813602fc3d2e24e3f6" Oct 06 09:21:48 crc kubenswrapper[4755]: I1006 09:21:48.913068 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:21:48 crc kubenswrapper[4755]: I1006 09:21:48.914230 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:22:18 crc kubenswrapper[4755]: I1006 09:22:18.912565 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:22:18 crc kubenswrapper[4755]: I1006 09:22:18.913077 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:22:48 crc kubenswrapper[4755]: I1006 09:22:48.912013 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:22:48 crc kubenswrapper[4755]: I1006 09:22:48.912696 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:22:48 crc kubenswrapper[4755]: I1006 
09:22:48.912762 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 09:22:48 crc kubenswrapper[4755]: I1006 09:22:48.913679 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"07fcb6bf36260f4366321f1ce8755cfe4cd512cf29960b407d348b3ea9e1e47a"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:22:48 crc kubenswrapper[4755]: I1006 09:22:48.913764 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://07fcb6bf36260f4366321f1ce8755cfe4cd512cf29960b407d348b3ea9e1e47a" gracePeriod=600 Oct 06 09:22:49 crc kubenswrapper[4755]: I1006 09:22:49.524786 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="07fcb6bf36260f4366321f1ce8755cfe4cd512cf29960b407d348b3ea9e1e47a" exitCode=0 Oct 06 09:22:49 crc kubenswrapper[4755]: I1006 09:22:49.525304 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"07fcb6bf36260f4366321f1ce8755cfe4cd512cf29960b407d348b3ea9e1e47a"} Oct 06 09:22:49 crc kubenswrapper[4755]: I1006 09:22:49.525332 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"} Oct 06 09:22:49 crc kubenswrapper[4755]: I1006 09:22:49.525347 4755 scope.go:117] 
"RemoveContainer" containerID="09da59b4d8b85c14720247fb26631e9a71f4c188f8376413c64312486e2e0f20" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.274136 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b9789"] Oct 06 09:23:31 crc kubenswrapper[4755]: E1006 09:23:31.275020 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" containerName="extract-content" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.275032 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" containerName="extract-content" Oct 06 09:23:31 crc kubenswrapper[4755]: E1006 09:23:31.275042 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" containerName="registry-server" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.275047 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" containerName="registry-server" Oct 06 09:23:31 crc kubenswrapper[4755]: E1006 09:23:31.275067 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" containerName="extract-utilities" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.275075 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" containerName="extract-utilities" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.275282 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f50b38da-2a3e-4475-9efa-2b94c0a7d8ce" containerName="registry-server" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.276629 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.283381 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9789"] Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.358655 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-utilities\") pod \"redhat-operators-b9789\" (UID: \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\") " pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.358779 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7rwj\" (UniqueName: \"kubernetes.io/projected/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-kube-api-access-b7rwj\") pod \"redhat-operators-b9789\" (UID: \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\") " pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.358884 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-catalog-content\") pod \"redhat-operators-b9789\" (UID: \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\") " pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.466482 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-catalog-content\") pod \"redhat-operators-b9789\" (UID: \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\") " pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.466686 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-utilities\") pod \"redhat-operators-b9789\" (UID: \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\") " pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.466779 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7rwj\" (UniqueName: \"kubernetes.io/projected/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-kube-api-access-b7rwj\") pod \"redhat-operators-b9789\" (UID: \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\") " pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.468040 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-catalog-content\") pod \"redhat-operators-b9789\" (UID: \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\") " pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.468325 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-utilities\") pod \"redhat-operators-b9789\" (UID: \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\") " pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.507357 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7rwj\" (UniqueName: \"kubernetes.io/projected/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-kube-api-access-b7rwj\") pod \"redhat-operators-b9789\" (UID: \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\") " pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:31 crc kubenswrapper[4755]: I1006 09:23:31.616254 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:32 crc kubenswrapper[4755]: I1006 09:23:32.102094 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9789"] Oct 06 09:23:32 crc kubenswrapper[4755]: I1006 09:23:32.928661 4755 generic.go:334] "Generic (PLEG): container finished" podID="1ce111ad-3b52-4f87-a88f-fa64a7edfb69" containerID="da87937689773f7efa62b5237ee257c689f211e0b5cc8c329ce756908048da6d" exitCode=0 Oct 06 09:23:32 crc kubenswrapper[4755]: I1006 09:23:32.928735 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9789" event={"ID":"1ce111ad-3b52-4f87-a88f-fa64a7edfb69","Type":"ContainerDied","Data":"da87937689773f7efa62b5237ee257c689f211e0b5cc8c329ce756908048da6d"} Oct 06 09:23:32 crc kubenswrapper[4755]: I1006 09:23:32.928990 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9789" event={"ID":"1ce111ad-3b52-4f87-a88f-fa64a7edfb69","Type":"ContainerStarted","Data":"69e40ebd11a5123d12fd2c693ce625a0c0b1bdda7705d2eab41d2c1e58fbaaa0"} Oct 06 09:23:34 crc kubenswrapper[4755]: I1006 09:23:34.971810 4755 generic.go:334] "Generic (PLEG): container finished" podID="1ce111ad-3b52-4f87-a88f-fa64a7edfb69" containerID="245726cdca826d0673a02f303d08a83b37f003d8093000d2d481f84d2ab8ad61" exitCode=0 Oct 06 09:23:34 crc kubenswrapper[4755]: I1006 09:23:34.971929 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9789" event={"ID":"1ce111ad-3b52-4f87-a88f-fa64a7edfb69","Type":"ContainerDied","Data":"245726cdca826d0673a02f303d08a83b37f003d8093000d2d481f84d2ab8ad61"} Oct 06 09:23:35 crc kubenswrapper[4755]: I1006 09:23:35.985227 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9789" 
event={"ID":"1ce111ad-3b52-4f87-a88f-fa64a7edfb69","Type":"ContainerStarted","Data":"f78394c7e928e1293775650fe69578dbf868a4245c81451dc15ead70d48cd21f"} Oct 06 09:23:36 crc kubenswrapper[4755]: I1006 09:23:36.013059 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b9789" podStartSLOduration=2.383674294 podStartE2EDuration="5.013040123s" podCreationTimestamp="2025-10-06 09:23:31 +0000 UTC" firstStartedPulling="2025-10-06 09:23:32.930219045 +0000 UTC m=+3669.759534259" lastFinishedPulling="2025-10-06 09:23:35.559584884 +0000 UTC m=+3672.388900088" observedRunningTime="2025-10-06 09:23:36.009806313 +0000 UTC m=+3672.839121517" watchObservedRunningTime="2025-10-06 09:23:36.013040123 +0000 UTC m=+3672.842355337" Oct 06 09:23:41 crc kubenswrapper[4755]: I1006 09:23:41.617055 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:41 crc kubenswrapper[4755]: I1006 09:23:41.617533 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:41 crc kubenswrapper[4755]: I1006 09:23:41.670321 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:42 crc kubenswrapper[4755]: I1006 09:23:42.120156 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:42 crc kubenswrapper[4755]: I1006 09:23:42.196595 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9789"] Oct 06 09:23:44 crc kubenswrapper[4755]: I1006 09:23:44.082511 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b9789" podUID="1ce111ad-3b52-4f87-a88f-fa64a7edfb69" containerName="registry-server" 
containerID="cri-o://f78394c7e928e1293775650fe69578dbf868a4245c81451dc15ead70d48cd21f" gracePeriod=2 Oct 06 09:23:45 crc kubenswrapper[4755]: I1006 09:23:45.093571 4755 generic.go:334] "Generic (PLEG): container finished" podID="1ce111ad-3b52-4f87-a88f-fa64a7edfb69" containerID="f78394c7e928e1293775650fe69578dbf868a4245c81451dc15ead70d48cd21f" exitCode=0 Oct 06 09:23:45 crc kubenswrapper[4755]: I1006 09:23:45.093615 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9789" event={"ID":"1ce111ad-3b52-4f87-a88f-fa64a7edfb69","Type":"ContainerDied","Data":"f78394c7e928e1293775650fe69578dbf868a4245c81451dc15ead70d48cd21f"} Oct 06 09:23:45 crc kubenswrapper[4755]: I1006 09:23:45.520786 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:45 crc kubenswrapper[4755]: I1006 09:23:45.592990 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-catalog-content\") pod \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\" (UID: \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\") " Oct 06 09:23:45 crc kubenswrapper[4755]: I1006 09:23:45.593279 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7rwj\" (UniqueName: \"kubernetes.io/projected/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-kube-api-access-b7rwj\") pod \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\" (UID: \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\") " Oct 06 09:23:45 crc kubenswrapper[4755]: I1006 09:23:45.595927 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-utilities\") pod \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\" (UID: \"1ce111ad-3b52-4f87-a88f-fa64a7edfb69\") " Oct 06 09:23:45 crc kubenswrapper[4755]: I1006 
09:23:45.597368 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-utilities" (OuterVolumeSpecName: "utilities") pod "1ce111ad-3b52-4f87-a88f-fa64a7edfb69" (UID: "1ce111ad-3b52-4f87-a88f-fa64a7edfb69"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:23:45 crc kubenswrapper[4755]: I1006 09:23:45.597856 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:23:45 crc kubenswrapper[4755]: I1006 09:23:45.609846 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-kube-api-access-b7rwj" (OuterVolumeSpecName: "kube-api-access-b7rwj") pod "1ce111ad-3b52-4f87-a88f-fa64a7edfb69" (UID: "1ce111ad-3b52-4f87-a88f-fa64a7edfb69"). InnerVolumeSpecName "kube-api-access-b7rwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:23:45 crc kubenswrapper[4755]: I1006 09:23:45.670225 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ce111ad-3b52-4f87-a88f-fa64a7edfb69" (UID: "1ce111ad-3b52-4f87-a88f-fa64a7edfb69"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:23:45 crc kubenswrapper[4755]: I1006 09:23:45.699545 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:23:45 crc kubenswrapper[4755]: I1006 09:23:45.699611 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7rwj\" (UniqueName: \"kubernetes.io/projected/1ce111ad-3b52-4f87-a88f-fa64a7edfb69-kube-api-access-b7rwj\") on node \"crc\" DevicePath \"\"" Oct 06 09:23:46 crc kubenswrapper[4755]: I1006 09:23:46.108771 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9789" event={"ID":"1ce111ad-3b52-4f87-a88f-fa64a7edfb69","Type":"ContainerDied","Data":"69e40ebd11a5123d12fd2c693ce625a0c0b1bdda7705d2eab41d2c1e58fbaaa0"} Oct 06 09:23:46 crc kubenswrapper[4755]: I1006 09:23:46.109350 4755 scope.go:117] "RemoveContainer" containerID="f78394c7e928e1293775650fe69578dbf868a4245c81451dc15ead70d48cd21f" Oct 06 09:23:46 crc kubenswrapper[4755]: I1006 09:23:46.108842 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9789" Oct 06 09:23:46 crc kubenswrapper[4755]: I1006 09:23:46.139276 4755 scope.go:117] "RemoveContainer" containerID="245726cdca826d0673a02f303d08a83b37f003d8093000d2d481f84d2ab8ad61" Oct 06 09:23:46 crc kubenswrapper[4755]: I1006 09:23:46.144013 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9789"] Oct 06 09:23:46 crc kubenswrapper[4755]: I1006 09:23:46.159039 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b9789"] Oct 06 09:23:46 crc kubenswrapper[4755]: I1006 09:23:46.167138 4755 scope.go:117] "RemoveContainer" containerID="da87937689773f7efa62b5237ee257c689f211e0b5cc8c329ce756908048da6d" Oct 06 09:23:47 crc kubenswrapper[4755]: I1006 09:23:47.890007 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ce111ad-3b52-4f87-a88f-fa64a7edfb69" path="/var/lib/kubelet/pods/1ce111ad-3b52-4f87-a88f-fa64a7edfb69/volumes" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.176748 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-skkfc"] Oct 06 09:25:04 crc kubenswrapper[4755]: E1006 09:25:04.177747 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce111ad-3b52-4f87-a88f-fa64a7edfb69" containerName="extract-utilities" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.177764 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce111ad-3b52-4f87-a88f-fa64a7edfb69" containerName="extract-utilities" Oct 06 09:25:04 crc kubenswrapper[4755]: E1006 09:25:04.177789 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce111ad-3b52-4f87-a88f-fa64a7edfb69" containerName="registry-server" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.177804 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce111ad-3b52-4f87-a88f-fa64a7edfb69" containerName="registry-server" Oct 06 
09:25:04 crc kubenswrapper[4755]: E1006 09:25:04.177826 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce111ad-3b52-4f87-a88f-fa64a7edfb69" containerName="extract-content" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.177832 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce111ad-3b52-4f87-a88f-fa64a7edfb69" containerName="extract-content" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.178046 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ce111ad-3b52-4f87-a88f-fa64a7edfb69" containerName="registry-server" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.181143 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.195341 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-skkfc"] Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.281918 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6e5a44-8bd8-4e65-8dec-4c236175213b-utilities\") pod \"redhat-marketplace-skkfc\" (UID: \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\") " pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.282217 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2tgl\" (UniqueName: \"kubernetes.io/projected/6e6e5a44-8bd8-4e65-8dec-4c236175213b-kube-api-access-b2tgl\") pod \"redhat-marketplace-skkfc\" (UID: \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\") " pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.282367 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6e6e5a44-8bd8-4e65-8dec-4c236175213b-catalog-content\") pod \"redhat-marketplace-skkfc\" (UID: \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\") " pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.383489 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6e5a44-8bd8-4e65-8dec-4c236175213b-utilities\") pod \"redhat-marketplace-skkfc\" (UID: \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\") " pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.383541 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2tgl\" (UniqueName: \"kubernetes.io/projected/6e6e5a44-8bd8-4e65-8dec-4c236175213b-kube-api-access-b2tgl\") pod \"redhat-marketplace-skkfc\" (UID: \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\") " pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.383689 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e6e5a44-8bd8-4e65-8dec-4c236175213b-catalog-content\") pod \"redhat-marketplace-skkfc\" (UID: \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\") " pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.384120 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e6e5a44-8bd8-4e65-8dec-4c236175213b-catalog-content\") pod \"redhat-marketplace-skkfc\" (UID: \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\") " pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.384327 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6e6e5a44-8bd8-4e65-8dec-4c236175213b-utilities\") pod \"redhat-marketplace-skkfc\" (UID: \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\") " pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.402904 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2tgl\" (UniqueName: \"kubernetes.io/projected/6e6e5a44-8bd8-4e65-8dec-4c236175213b-kube-api-access-b2tgl\") pod \"redhat-marketplace-skkfc\" (UID: \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\") " pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:04 crc kubenswrapper[4755]: I1006 09:25:04.509037 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:05 crc kubenswrapper[4755]: I1006 09:25:05.031023 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-skkfc"] Oct 06 09:25:05 crc kubenswrapper[4755]: I1006 09:25:05.827794 4755 generic.go:334] "Generic (PLEG): container finished" podID="6e6e5a44-8bd8-4e65-8dec-4c236175213b" containerID="fec227fee93e7fdb7fab0d7358d84f751811dc1b549e0e6757588c8e7bf228b7" exitCode=0 Oct 06 09:25:05 crc kubenswrapper[4755]: I1006 09:25:05.827937 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skkfc" event={"ID":"6e6e5a44-8bd8-4e65-8dec-4c236175213b","Type":"ContainerDied","Data":"fec227fee93e7fdb7fab0d7358d84f751811dc1b549e0e6757588c8e7bf228b7"} Oct 06 09:25:05 crc kubenswrapper[4755]: I1006 09:25:05.828123 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skkfc" event={"ID":"6e6e5a44-8bd8-4e65-8dec-4c236175213b","Type":"ContainerStarted","Data":"e190998a3421a939168883e833a3896e5ed237b704e8c535f83f932af026f874"} Oct 06 09:25:05 crc kubenswrapper[4755]: I1006 09:25:05.830406 4755 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 06 09:25:06 crc kubenswrapper[4755]: I1006 09:25:06.837888 4755 generic.go:334] "Generic (PLEG): container finished" podID="6e6e5a44-8bd8-4e65-8dec-4c236175213b" containerID="1a9f4fad7e01903c5b1106a8e4f37558e13cc3d5f6eb4a3a317fe08169dd9a45" exitCode=0 Oct 06 09:25:06 crc kubenswrapper[4755]: I1006 09:25:06.837994 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skkfc" event={"ID":"6e6e5a44-8bd8-4e65-8dec-4c236175213b","Type":"ContainerDied","Data":"1a9f4fad7e01903c5b1106a8e4f37558e13cc3d5f6eb4a3a317fe08169dd9a45"} Oct 06 09:25:07 crc kubenswrapper[4755]: I1006 09:25:07.848344 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skkfc" event={"ID":"6e6e5a44-8bd8-4e65-8dec-4c236175213b","Type":"ContainerStarted","Data":"5b8c8918e5ecf0f29d7b0e17e7b0a3a852042223ef66f91718a4a0f4178be3c8"} Oct 06 09:25:07 crc kubenswrapper[4755]: I1006 09:25:07.864367 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-skkfc" podStartSLOduration=2.32124624 podStartE2EDuration="3.864351455s" podCreationTimestamp="2025-10-06 09:25:04 +0000 UTC" firstStartedPulling="2025-10-06 09:25:05.830206225 +0000 UTC m=+3762.659521439" lastFinishedPulling="2025-10-06 09:25:07.37331144 +0000 UTC m=+3764.202626654" observedRunningTime="2025-10-06 09:25:07.862215003 +0000 UTC m=+3764.691530227" watchObservedRunningTime="2025-10-06 09:25:07.864351455 +0000 UTC m=+3764.693666669" Oct 06 09:25:14 crc kubenswrapper[4755]: I1006 09:25:14.509447 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:14 crc kubenswrapper[4755]: I1006 09:25:14.510094 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:14 crc kubenswrapper[4755]: 
I1006 09:25:14.559861 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:14 crc kubenswrapper[4755]: I1006 09:25:14.965687 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:15 crc kubenswrapper[4755]: I1006 09:25:15.010710 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-skkfc"] Oct 06 09:25:16 crc kubenswrapper[4755]: I1006 09:25:16.931882 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-skkfc" podUID="6e6e5a44-8bd8-4e65-8dec-4c236175213b" containerName="registry-server" containerID="cri-o://5b8c8918e5ecf0f29d7b0e17e7b0a3a852042223ef66f91718a4a0f4178be3c8" gracePeriod=2 Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.536305 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.641624 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6e5a44-8bd8-4e65-8dec-4c236175213b-utilities\") pod \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\" (UID: \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\") " Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.641741 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2tgl\" (UniqueName: \"kubernetes.io/projected/6e6e5a44-8bd8-4e65-8dec-4c236175213b-kube-api-access-b2tgl\") pod \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\" (UID: \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\") " Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.641963 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6e6e5a44-8bd8-4e65-8dec-4c236175213b-catalog-content\") pod \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\" (UID: \"6e6e5a44-8bd8-4e65-8dec-4c236175213b\") " Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.642665 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e6e5a44-8bd8-4e65-8dec-4c236175213b-utilities" (OuterVolumeSpecName: "utilities") pod "6e6e5a44-8bd8-4e65-8dec-4c236175213b" (UID: "6e6e5a44-8bd8-4e65-8dec-4c236175213b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.643205 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6e5a44-8bd8-4e65-8dec-4c236175213b-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.646585 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e6e5a44-8bd8-4e65-8dec-4c236175213b-kube-api-access-b2tgl" (OuterVolumeSpecName: "kube-api-access-b2tgl") pod "6e6e5a44-8bd8-4e65-8dec-4c236175213b" (UID: "6e6e5a44-8bd8-4e65-8dec-4c236175213b"). InnerVolumeSpecName "kube-api-access-b2tgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.654595 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e6e5a44-8bd8-4e65-8dec-4c236175213b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e6e5a44-8bd8-4e65-8dec-4c236175213b" (UID: "6e6e5a44-8bd8-4e65-8dec-4c236175213b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.746133 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2tgl\" (UniqueName: \"kubernetes.io/projected/6e6e5a44-8bd8-4e65-8dec-4c236175213b-kube-api-access-b2tgl\") on node \"crc\" DevicePath \"\"" Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.746592 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e6e5a44-8bd8-4e65-8dec-4c236175213b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.947004 4755 generic.go:334] "Generic (PLEG): container finished" podID="6e6e5a44-8bd8-4e65-8dec-4c236175213b" containerID="5b8c8918e5ecf0f29d7b0e17e7b0a3a852042223ef66f91718a4a0f4178be3c8" exitCode=0 Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.947093 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skkfc" Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.947679 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skkfc" event={"ID":"6e6e5a44-8bd8-4e65-8dec-4c236175213b","Type":"ContainerDied","Data":"5b8c8918e5ecf0f29d7b0e17e7b0a3a852042223ef66f91718a4a0f4178be3c8"} Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.947771 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skkfc" event={"ID":"6e6e5a44-8bd8-4e65-8dec-4c236175213b","Type":"ContainerDied","Data":"e190998a3421a939168883e833a3896e5ed237b704e8c535f83f932af026f874"} Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.947805 4755 scope.go:117] "RemoveContainer" containerID="5b8c8918e5ecf0f29d7b0e17e7b0a3a852042223ef66f91718a4a0f4178be3c8" Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.985224 4755 scope.go:117] "RemoveContainer" 
containerID="1a9f4fad7e01903c5b1106a8e4f37558e13cc3d5f6eb4a3a317fe08169dd9a45" Oct 06 09:25:17 crc kubenswrapper[4755]: I1006 09:25:17.998545 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-skkfc"] Oct 06 09:25:18 crc kubenswrapper[4755]: I1006 09:25:18.009053 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-skkfc"] Oct 06 09:25:18 crc kubenswrapper[4755]: I1006 09:25:18.016967 4755 scope.go:117] "RemoveContainer" containerID="fec227fee93e7fdb7fab0d7358d84f751811dc1b549e0e6757588c8e7bf228b7" Oct 06 09:25:18 crc kubenswrapper[4755]: I1006 09:25:18.062408 4755 scope.go:117] "RemoveContainer" containerID="5b8c8918e5ecf0f29d7b0e17e7b0a3a852042223ef66f91718a4a0f4178be3c8" Oct 06 09:25:18 crc kubenswrapper[4755]: E1006 09:25:18.062855 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b8c8918e5ecf0f29d7b0e17e7b0a3a852042223ef66f91718a4a0f4178be3c8\": container with ID starting with 5b8c8918e5ecf0f29d7b0e17e7b0a3a852042223ef66f91718a4a0f4178be3c8 not found: ID does not exist" containerID="5b8c8918e5ecf0f29d7b0e17e7b0a3a852042223ef66f91718a4a0f4178be3c8" Oct 06 09:25:18 crc kubenswrapper[4755]: I1006 09:25:18.062888 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8c8918e5ecf0f29d7b0e17e7b0a3a852042223ef66f91718a4a0f4178be3c8"} err="failed to get container status \"5b8c8918e5ecf0f29d7b0e17e7b0a3a852042223ef66f91718a4a0f4178be3c8\": rpc error: code = NotFound desc = could not find container \"5b8c8918e5ecf0f29d7b0e17e7b0a3a852042223ef66f91718a4a0f4178be3c8\": container with ID starting with 5b8c8918e5ecf0f29d7b0e17e7b0a3a852042223ef66f91718a4a0f4178be3c8 not found: ID does not exist" Oct 06 09:25:18 crc kubenswrapper[4755]: I1006 09:25:18.062908 4755 scope.go:117] "RemoveContainer" 
containerID="1a9f4fad7e01903c5b1106a8e4f37558e13cc3d5f6eb4a3a317fe08169dd9a45" Oct 06 09:25:18 crc kubenswrapper[4755]: E1006 09:25:18.063359 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a9f4fad7e01903c5b1106a8e4f37558e13cc3d5f6eb4a3a317fe08169dd9a45\": container with ID starting with 1a9f4fad7e01903c5b1106a8e4f37558e13cc3d5f6eb4a3a317fe08169dd9a45 not found: ID does not exist" containerID="1a9f4fad7e01903c5b1106a8e4f37558e13cc3d5f6eb4a3a317fe08169dd9a45" Oct 06 09:25:18 crc kubenswrapper[4755]: I1006 09:25:18.063410 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a9f4fad7e01903c5b1106a8e4f37558e13cc3d5f6eb4a3a317fe08169dd9a45"} err="failed to get container status \"1a9f4fad7e01903c5b1106a8e4f37558e13cc3d5f6eb4a3a317fe08169dd9a45\": rpc error: code = NotFound desc = could not find container \"1a9f4fad7e01903c5b1106a8e4f37558e13cc3d5f6eb4a3a317fe08169dd9a45\": container with ID starting with 1a9f4fad7e01903c5b1106a8e4f37558e13cc3d5f6eb4a3a317fe08169dd9a45 not found: ID does not exist" Oct 06 09:25:18 crc kubenswrapper[4755]: I1006 09:25:18.063461 4755 scope.go:117] "RemoveContainer" containerID="fec227fee93e7fdb7fab0d7358d84f751811dc1b549e0e6757588c8e7bf228b7" Oct 06 09:25:18 crc kubenswrapper[4755]: E1006 09:25:18.063953 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fec227fee93e7fdb7fab0d7358d84f751811dc1b549e0e6757588c8e7bf228b7\": container with ID starting with fec227fee93e7fdb7fab0d7358d84f751811dc1b549e0e6757588c8e7bf228b7 not found: ID does not exist" containerID="fec227fee93e7fdb7fab0d7358d84f751811dc1b549e0e6757588c8e7bf228b7" Oct 06 09:25:18 crc kubenswrapper[4755]: I1006 09:25:18.064054 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fec227fee93e7fdb7fab0d7358d84f751811dc1b549e0e6757588c8e7bf228b7"} err="failed to get container status \"fec227fee93e7fdb7fab0d7358d84f751811dc1b549e0e6757588c8e7bf228b7\": rpc error: code = NotFound desc = could not find container \"fec227fee93e7fdb7fab0d7358d84f751811dc1b549e0e6757588c8e7bf228b7\": container with ID starting with fec227fee93e7fdb7fab0d7358d84f751811dc1b549e0e6757588c8e7bf228b7 not found: ID does not exist" Oct 06 09:25:18 crc kubenswrapper[4755]: I1006 09:25:18.912954 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:25:18 crc kubenswrapper[4755]: I1006 09:25:18.913665 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:25:19 crc kubenswrapper[4755]: I1006 09:25:19.903108 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e6e5a44-8bd8-4e65-8dec-4c236175213b" path="/var/lib/kubelet/pods/6e6e5a44-8bd8-4e65-8dec-4c236175213b/volumes" Oct 06 09:25:48 crc kubenswrapper[4755]: I1006 09:25:48.911975 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:25:48 crc kubenswrapper[4755]: I1006 09:25:48.913501 4755 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:26:18 crc kubenswrapper[4755]: I1006 09:26:18.912914 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:26:18 crc kubenswrapper[4755]: I1006 09:26:18.913548 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:26:18 crc kubenswrapper[4755]: I1006 09:26:18.913626 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 09:26:18 crc kubenswrapper[4755]: I1006 09:26:18.914376 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:26:18 crc kubenswrapper[4755]: I1006 09:26:18.914420 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" 
containerID="cri-o://0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e" gracePeriod=600 Oct 06 09:26:19 crc kubenswrapper[4755]: I1006 09:26:19.465837 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e" exitCode=0 Oct 06 09:26:19 crc kubenswrapper[4755]: I1006 09:26:19.465916 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"} Oct 06 09:26:19 crc kubenswrapper[4755]: I1006 09:26:19.466220 4755 scope.go:117] "RemoveContainer" containerID="07fcb6bf36260f4366321f1ce8755cfe4cd512cf29960b407d348b3ea9e1e47a" Oct 06 09:26:19 crc kubenswrapper[4755]: E1006 09:26:19.482082 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:26:20 crc kubenswrapper[4755]: I1006 09:26:20.477453 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e" Oct 06 09:26:20 crc kubenswrapper[4755]: E1006 09:26:20.478070 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" 
podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:26:35 crc kubenswrapper[4755]: I1006 09:26:35.878774 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e" Oct 06 09:26:35 crc kubenswrapper[4755]: E1006 09:26:35.881043 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:26:49 crc kubenswrapper[4755]: I1006 09:26:49.878727 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e" Oct 06 09:26:49 crc kubenswrapper[4755]: E1006 09:26:49.879478 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:27:03 crc kubenswrapper[4755]: I1006 09:27:03.887013 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e" Oct 06 09:27:03 crc kubenswrapper[4755]: E1006 09:27:03.887966 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:27:15 crc kubenswrapper[4755]: I1006 09:27:15.878608 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:27:15 crc kubenswrapper[4755]: E1006 09:27:15.879391 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:27:26 crc kubenswrapper[4755]: I1006 09:27:26.879331 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:27:26 crc kubenswrapper[4755]: E1006 09:27:26.880183 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:27:40 crc kubenswrapper[4755]: I1006 09:27:40.879809 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:27:40 crc kubenswrapper[4755]: E1006 09:27:40.881087 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:27:53 crc kubenswrapper[4755]: I1006 09:27:53.886255 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:27:53 crc kubenswrapper[4755]: E1006 09:27:53.887189 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:28:06 crc kubenswrapper[4755]: I1006 09:28:06.879195 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:28:06 crc kubenswrapper[4755]: E1006 09:28:06.880093 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:28:20 crc kubenswrapper[4755]: I1006 09:28:20.879208 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:28:20 crc kubenswrapper[4755]: E1006 09:28:20.880072 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:28:33 crc kubenswrapper[4755]: I1006 09:28:33.887215 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:28:33 crc kubenswrapper[4755]: E1006 09:28:33.888055 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:28:47 crc kubenswrapper[4755]: I1006 09:28:47.879926 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:28:47 crc kubenswrapper[4755]: E1006 09:28:47.880692 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:29:01 crc kubenswrapper[4755]: I1006 09:29:01.879540 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:29:01 crc kubenswrapper[4755]: E1006 09:29:01.880338 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:29:12 crc kubenswrapper[4755]: I1006 09:29:12.879738 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:29:12 crc kubenswrapper[4755]: E1006 09:29:12.880857 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:29:26 crc kubenswrapper[4755]: I1006 09:29:26.880261 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:29:26 crc kubenswrapper[4755]: E1006 09:29:26.881593 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:29:41 crc kubenswrapper[4755]: I1006 09:29:41.879511 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:29:41 crc kubenswrapper[4755]: E1006 09:29:41.880414 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:29:53 crc kubenswrapper[4755]: I1006 09:29:53.886776 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:29:53 crc kubenswrapper[4755]: E1006 09:29:53.899256 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.143660 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"]
Oct 06 09:30:00 crc kubenswrapper[4755]: E1006 09:30:00.144752 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6e5a44-8bd8-4e65-8dec-4c236175213b" containerName="extract-content"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.144771 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6e5a44-8bd8-4e65-8dec-4c236175213b" containerName="extract-content"
Oct 06 09:30:00 crc kubenswrapper[4755]: E1006 09:30:00.144781 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6e5a44-8bd8-4e65-8dec-4c236175213b" containerName="extract-utilities"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.144789 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6e5a44-8bd8-4e65-8dec-4c236175213b" containerName="extract-utilities"
Oct 06 09:30:00 crc kubenswrapper[4755]: E1006 09:30:00.144803 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6e5a44-8bd8-4e65-8dec-4c236175213b" containerName="registry-server"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.144812 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6e5a44-8bd8-4e65-8dec-4c236175213b" containerName="registry-server"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.145110 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e6e5a44-8bd8-4e65-8dec-4c236175213b" containerName="registry-server"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.145960 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.148298 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.148344 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.154374 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"]
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.178039 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e71ed3cf-3a95-4944-9118-5e2d809703e6-config-volume\") pod \"collect-profiles-29329050-d9p78\" (UID: \"e71ed3cf-3a95-4944-9118-5e2d809703e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.178205 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq7km\" (UniqueName: \"kubernetes.io/projected/e71ed3cf-3a95-4944-9118-5e2d809703e6-kube-api-access-tq7km\") pod \"collect-profiles-29329050-d9p78\" (UID: \"e71ed3cf-3a95-4944-9118-5e2d809703e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.178248 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e71ed3cf-3a95-4944-9118-5e2d809703e6-secret-volume\") pod \"collect-profiles-29329050-d9p78\" (UID: \"e71ed3cf-3a95-4944-9118-5e2d809703e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.280179 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e71ed3cf-3a95-4944-9118-5e2d809703e6-config-volume\") pod \"collect-profiles-29329050-d9p78\" (UID: \"e71ed3cf-3a95-4944-9118-5e2d809703e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.280897 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq7km\" (UniqueName: \"kubernetes.io/projected/e71ed3cf-3a95-4944-9118-5e2d809703e6-kube-api-access-tq7km\") pod \"collect-profiles-29329050-d9p78\" (UID: \"e71ed3cf-3a95-4944-9118-5e2d809703e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.281123 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e71ed3cf-3a95-4944-9118-5e2d809703e6-secret-volume\") pod \"collect-profiles-29329050-d9p78\" (UID: \"e71ed3cf-3a95-4944-9118-5e2d809703e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.281242 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e71ed3cf-3a95-4944-9118-5e2d809703e6-config-volume\") pod \"collect-profiles-29329050-d9p78\" (UID: \"e71ed3cf-3a95-4944-9118-5e2d809703e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.288513 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e71ed3cf-3a95-4944-9118-5e2d809703e6-secret-volume\") pod \"collect-profiles-29329050-d9p78\" (UID: \"e71ed3cf-3a95-4944-9118-5e2d809703e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.299431 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq7km\" (UniqueName: \"kubernetes.io/projected/e71ed3cf-3a95-4944-9118-5e2d809703e6-kube-api-access-tq7km\") pod \"collect-profiles-29329050-d9p78\" (UID: \"e71ed3cf-3a95-4944-9118-5e2d809703e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"
Oct 06 09:30:00 crc kubenswrapper[4755]: I1006 09:30:00.466862 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"
Oct 06 09:30:01 crc kubenswrapper[4755]: I1006 09:30:01.064163 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"]
Oct 06 09:30:01 crc kubenswrapper[4755]: I1006 09:30:01.366523 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78" event={"ID":"e71ed3cf-3a95-4944-9118-5e2d809703e6","Type":"ContainerStarted","Data":"16513a30ced1db1697b087c32f826dbd14873ce55a04182e1509eb07a587454c"}
Oct 06 09:30:01 crc kubenswrapper[4755]: I1006 09:30:01.366826 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78" event={"ID":"e71ed3cf-3a95-4944-9118-5e2d809703e6","Type":"ContainerStarted","Data":"ce7ef3448cce0dd87da1599dd48efdb24c0743122e710d0a95bf4af4a5cac001"}
Oct 06 09:30:01 crc kubenswrapper[4755]: I1006 09:30:01.383766 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78" podStartSLOduration=1.383744442 podStartE2EDuration="1.383744442s" podCreationTimestamp="2025-10-06 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:30:01.381687451 +0000 UTC m=+4058.211002685" watchObservedRunningTime="2025-10-06 09:30:01.383744442 +0000 UTC m=+4058.213059656"
Oct 06 09:30:01 crc kubenswrapper[4755]: E1006 09:30:01.807679 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode71ed3cf_3a95_4944_9118_5e2d809703e6.slice/crio-conmon-16513a30ced1db1697b087c32f826dbd14873ce55a04182e1509eb07a587454c.scope\": RecentStats: unable to find data in memory cache]"
Oct 06 09:30:02 crc kubenswrapper[4755]: I1006 09:30:02.378912 4755 generic.go:334] "Generic (PLEG): container finished" podID="e71ed3cf-3a95-4944-9118-5e2d809703e6" containerID="16513a30ced1db1697b087c32f826dbd14873ce55a04182e1509eb07a587454c" exitCode=0
Oct 06 09:30:02 crc kubenswrapper[4755]: I1006 09:30:02.379021 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78" event={"ID":"e71ed3cf-3a95-4944-9118-5e2d809703e6","Type":"ContainerDied","Data":"16513a30ced1db1697b087c32f826dbd14873ce55a04182e1509eb07a587454c"}
Oct 06 09:30:03 crc kubenswrapper[4755]: I1006 09:30:03.844349 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"
Oct 06 09:30:03 crc kubenswrapper[4755]: I1006 09:30:03.975096 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq7km\" (UniqueName: \"kubernetes.io/projected/e71ed3cf-3a95-4944-9118-5e2d809703e6-kube-api-access-tq7km\") pod \"e71ed3cf-3a95-4944-9118-5e2d809703e6\" (UID: \"e71ed3cf-3a95-4944-9118-5e2d809703e6\") "
Oct 06 09:30:03 crc kubenswrapper[4755]: I1006 09:30:03.975287 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e71ed3cf-3a95-4944-9118-5e2d809703e6-secret-volume\") pod \"e71ed3cf-3a95-4944-9118-5e2d809703e6\" (UID: \"e71ed3cf-3a95-4944-9118-5e2d809703e6\") "
Oct 06 09:30:03 crc kubenswrapper[4755]: I1006 09:30:03.975360 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e71ed3cf-3a95-4944-9118-5e2d809703e6-config-volume\") pod \"e71ed3cf-3a95-4944-9118-5e2d809703e6\" (UID: \"e71ed3cf-3a95-4944-9118-5e2d809703e6\") "
Oct 06 09:30:03 crc kubenswrapper[4755]: I1006 09:30:03.976315 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e71ed3cf-3a95-4944-9118-5e2d809703e6-config-volume" (OuterVolumeSpecName: "config-volume") pod "e71ed3cf-3a95-4944-9118-5e2d809703e6" (UID: "e71ed3cf-3a95-4944-9118-5e2d809703e6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 09:30:03 crc kubenswrapper[4755]: I1006 09:30:03.994506 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e71ed3cf-3a95-4944-9118-5e2d809703e6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e71ed3cf-3a95-4944-9118-5e2d809703e6" (UID: "e71ed3cf-3a95-4944-9118-5e2d809703e6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:30:03 crc kubenswrapper[4755]: I1006 09:30:03.996677 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e71ed3cf-3a95-4944-9118-5e2d809703e6-kube-api-access-tq7km" (OuterVolumeSpecName: "kube-api-access-tq7km") pod "e71ed3cf-3a95-4944-9118-5e2d809703e6" (UID: "e71ed3cf-3a95-4944-9118-5e2d809703e6"). InnerVolumeSpecName "kube-api-access-tq7km". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:30:04 crc kubenswrapper[4755]: I1006 09:30:04.078884 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq7km\" (UniqueName: \"kubernetes.io/projected/e71ed3cf-3a95-4944-9118-5e2d809703e6-kube-api-access-tq7km\") on node \"crc\" DevicePath \"\""
Oct 06 09:30:04 crc kubenswrapper[4755]: I1006 09:30:04.078982 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e71ed3cf-3a95-4944-9118-5e2d809703e6-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 06 09:30:04 crc kubenswrapper[4755]: I1006 09:30:04.079001 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e71ed3cf-3a95-4944-9118-5e2d809703e6-config-volume\") on node \"crc\" DevicePath \"\""
Oct 06 09:30:04 crc kubenswrapper[4755]: I1006 09:30:04.400710 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78" event={"ID":"e71ed3cf-3a95-4944-9118-5e2d809703e6","Type":"ContainerDied","Data":"ce7ef3448cce0dd87da1599dd48efdb24c0743122e710d0a95bf4af4a5cac001"}
Oct 06 09:30:04 crc kubenswrapper[4755]: I1006 09:30:04.401030 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce7ef3448cce0dd87da1599dd48efdb24c0743122e710d0a95bf4af4a5cac001"
Oct 06 09:30:04 crc kubenswrapper[4755]: I1006 09:30:04.400786 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-d9p78"
Oct 06 09:30:04 crc kubenswrapper[4755]: I1006 09:30:04.466790 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv"]
Oct 06 09:30:04 crc kubenswrapper[4755]: I1006 09:30:04.474390 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329005-sn7rv"]
Oct 06 09:30:05 crc kubenswrapper[4755]: I1006 09:30:05.892238 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db7c743b-75a0-4790-81a2-15f9c39d4624" path="/var/lib/kubelet/pods/db7c743b-75a0-4790-81a2-15f9c39d4624/volumes"
Oct 06 09:30:06 crc kubenswrapper[4755]: I1006 09:30:06.878824 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:30:06 crc kubenswrapper[4755]: E1006 09:30:06.879167 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:30:19 crc kubenswrapper[4755]: I1006 09:30:19.879135 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:30:19 crc kubenswrapper[4755]: E1006 09:30:19.880429 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:30:31 crc kubenswrapper[4755]: I1006 09:30:31.879550 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:30:31 crc kubenswrapper[4755]: E1006 09:30:31.880475 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:30:42 crc kubenswrapper[4755]: I1006 09:30:42.584289 4755 scope.go:117] "RemoveContainer" containerID="085f860133848095c4ac4996dbd52a34afd20d15ef2845340ead8f3447701876"
Oct 06 09:30:45 crc kubenswrapper[4755]: I1006 09:30:45.879400 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:30:45 crc kubenswrapper[4755]: E1006 09:30:45.880263 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:30:57 crc kubenswrapper[4755]: I1006 09:30:57.879387 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:30:57 crc kubenswrapper[4755]: E1006 09:30:57.880539 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.099755 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9qr9m"]
Oct 06 09:31:03 crc kubenswrapper[4755]: E1006 09:31:03.100803 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71ed3cf-3a95-4944-9118-5e2d809703e6" containerName="collect-profiles"
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.100824 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71ed3cf-3a95-4944-9118-5e2d809703e6" containerName="collect-profiles"
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.101085 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e71ed3cf-3a95-4944-9118-5e2d809703e6" containerName="collect-profiles"
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.102897 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.110197 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9qr9m"]
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.216120 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac92d25b-a788-4479-885a-2ae1d30cc38e-utilities\") pod \"community-operators-9qr9m\" (UID: \"ac92d25b-a788-4479-885a-2ae1d30cc38e\") " pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.216182 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac92d25b-a788-4479-885a-2ae1d30cc38e-catalog-content\") pod \"community-operators-9qr9m\" (UID: \"ac92d25b-a788-4479-885a-2ae1d30cc38e\") " pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.216262 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hfkw\" (UniqueName: \"kubernetes.io/projected/ac92d25b-a788-4479-885a-2ae1d30cc38e-kube-api-access-9hfkw\") pod \"community-operators-9qr9m\" (UID: \"ac92d25b-a788-4479-885a-2ae1d30cc38e\") " pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.318388 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac92d25b-a788-4479-885a-2ae1d30cc38e-utilities\") pod \"community-operators-9qr9m\" (UID: \"ac92d25b-a788-4479-885a-2ae1d30cc38e\") " pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.318764 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac92d25b-a788-4479-885a-2ae1d30cc38e-catalog-content\") pod \"community-operators-9qr9m\" (UID: \"ac92d25b-a788-4479-885a-2ae1d30cc38e\") " pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.318898 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac92d25b-a788-4479-885a-2ae1d30cc38e-utilities\") pod \"community-operators-9qr9m\" (UID: \"ac92d25b-a788-4479-885a-2ae1d30cc38e\") " pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.318997 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hfkw\" (UniqueName: \"kubernetes.io/projected/ac92d25b-a788-4479-885a-2ae1d30cc38e-kube-api-access-9hfkw\") pod \"community-operators-9qr9m\" (UID: \"ac92d25b-a788-4479-885a-2ae1d30cc38e\") " pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.318996 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac92d25b-a788-4479-885a-2ae1d30cc38e-catalog-content\") pod \"community-operators-9qr9m\" (UID: \"ac92d25b-a788-4479-885a-2ae1d30cc38e\") " pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:03 crc kubenswrapper[4755]: I1006 09:31:03.782752 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hfkw\" (UniqueName: \"kubernetes.io/projected/ac92d25b-a788-4479-885a-2ae1d30cc38e-kube-api-access-9hfkw\") pod \"community-operators-9qr9m\" (UID: \"ac92d25b-a788-4479-885a-2ae1d30cc38e\") " pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:04 crc kubenswrapper[4755]: I1006 09:31:04.030372 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:04 crc kubenswrapper[4755]: I1006 09:31:04.534718 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9qr9m"]
Oct 06 09:31:04 crc kubenswrapper[4755]: W1006 09:31:04.538402 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac92d25b_a788_4479_885a_2ae1d30cc38e.slice/crio-df16c7343a4254a10eb491850bccbc55f575ed1a9819d7227326c5092e39b674 WatchSource:0}: Error finding container df16c7343a4254a10eb491850bccbc55f575ed1a9819d7227326c5092e39b674: Status 404 returned error can't find the container with id df16c7343a4254a10eb491850bccbc55f575ed1a9819d7227326c5092e39b674
Oct 06 09:31:04 crc kubenswrapper[4755]: I1006 09:31:04.984473 4755 generic.go:334] "Generic (PLEG): container finished" podID="ac92d25b-a788-4479-885a-2ae1d30cc38e" containerID="39306401d3fb79714711d08fa0f7830b23c25f8ba4409ae81842567af6d686ab" exitCode=0
Oct 06 09:31:04 crc kubenswrapper[4755]: I1006 09:31:04.984535 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qr9m" event={"ID":"ac92d25b-a788-4479-885a-2ae1d30cc38e","Type":"ContainerDied","Data":"39306401d3fb79714711d08fa0f7830b23c25f8ba4409ae81842567af6d686ab"}
Oct 06 09:31:04 crc kubenswrapper[4755]: I1006 09:31:04.984872 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qr9m" event={"ID":"ac92d25b-a788-4479-885a-2ae1d30cc38e","Type":"ContainerStarted","Data":"df16c7343a4254a10eb491850bccbc55f575ed1a9819d7227326c5092e39b674"}
Oct 06 09:31:04 crc kubenswrapper[4755]: I1006 09:31:04.986440 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 06 09:31:05 crc kubenswrapper[4755]: I1006 09:31:05.995506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qr9m" event={"ID":"ac92d25b-a788-4479-885a-2ae1d30cc38e","Type":"ContainerStarted","Data":"5227d0318142ec469bce9c04ee1de175eb8c1b6e009ff5d3e4616ddf4d6bb6bf"}
Oct 06 09:31:07 crc kubenswrapper[4755]: I1006 09:31:07.005090 4755 generic.go:334] "Generic (PLEG): container finished" podID="ac92d25b-a788-4479-885a-2ae1d30cc38e" containerID="5227d0318142ec469bce9c04ee1de175eb8c1b6e009ff5d3e4616ddf4d6bb6bf" exitCode=0
Oct 06 09:31:07 crc kubenswrapper[4755]: I1006 09:31:07.005155 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qr9m" event={"ID":"ac92d25b-a788-4479-885a-2ae1d30cc38e","Type":"ContainerDied","Data":"5227d0318142ec469bce9c04ee1de175eb8c1b6e009ff5d3e4616ddf4d6bb6bf"}
Oct 06 09:31:08 crc kubenswrapper[4755]: I1006 09:31:08.879376 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e"
Oct 06 09:31:08 crc kubenswrapper[4755]: E1006 09:31:08.880349 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"
Oct 06 09:31:09 crc kubenswrapper[4755]: I1006 09:31:09.028058 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qr9m" event={"ID":"ac92d25b-a788-4479-885a-2ae1d30cc38e","Type":"ContainerStarted","Data":"28d0362c73120d95d12412a46686a0a67c53ff70f3d97dcae62e29c06fdc06e2"}
Oct 06 09:31:09 crc kubenswrapper[4755]: I1006 09:31:09.050672 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9qr9m" podStartSLOduration=3.137439605 podStartE2EDuration="6.05064837s" podCreationTimestamp="2025-10-06 09:31:03 +0000 UTC" firstStartedPulling="2025-10-06 09:31:04.98620897 +0000 UTC m=+4121.815524184" lastFinishedPulling="2025-10-06 09:31:07.899417735 +0000 UTC m=+4124.728732949" observedRunningTime="2025-10-06 09:31:09.043334021 +0000 UTC m=+4125.872649225" watchObservedRunningTime="2025-10-06 09:31:09.05064837 +0000 UTC m=+4125.879963584"
Oct 06 09:31:14 crc kubenswrapper[4755]: I1006 09:31:14.031090 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:14 crc kubenswrapper[4755]: I1006 09:31:14.032900 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:14 crc kubenswrapper[4755]: I1006 09:31:14.080762 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:14 crc kubenswrapper[4755]: I1006 09:31:14.133048 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:14 crc kubenswrapper[4755]: I1006 09:31:14.326524 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9qr9m"]
Oct 06 09:31:16 crc kubenswrapper[4755]: I1006 09:31:16.099072 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9qr9m" podUID="ac92d25b-a788-4479-885a-2ae1d30cc38e" containerName="registry-server" containerID="cri-o://28d0362c73120d95d12412a46686a0a67c53ff70f3d97dcae62e29c06fdc06e2" gracePeriod=2
Oct 06 09:31:16 crc kubenswrapper[4755]: I1006 09:31:16.651347 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9qr9m"
Oct 06 09:31:16 crc kubenswrapper[4755]: I1006 09:31:16.734339 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hfkw\" (UniqueName: \"kubernetes.io/projected/ac92d25b-a788-4479-885a-2ae1d30cc38e-kube-api-access-9hfkw\") pod \"ac92d25b-a788-4479-885a-2ae1d30cc38e\" (UID: \"ac92d25b-a788-4479-885a-2ae1d30cc38e\") "
Oct 06 09:31:16 crc kubenswrapper[4755]: I1006 09:31:16.734682 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac92d25b-a788-4479-885a-2ae1d30cc38e-catalog-content\") pod \"ac92d25b-a788-4479-885a-2ae1d30cc38e\" (UID: \"ac92d25b-a788-4479-885a-2ae1d30cc38e\") "
Oct 06 09:31:16 crc kubenswrapper[4755]: I1006 09:31:16.734783 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac92d25b-a788-4479-885a-2ae1d30cc38e-utilities\") pod \"ac92d25b-a788-4479-885a-2ae1d30cc38e\" (UID: \"ac92d25b-a788-4479-885a-2ae1d30cc38e\") "
Oct 06 09:31:16 crc kubenswrapper[4755]: I1006 09:31:16.735878 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac92d25b-a788-4479-885a-2ae1d30cc38e-utilities" (OuterVolumeSpecName: "utilities") pod "ac92d25b-a788-4479-885a-2ae1d30cc38e" (UID: "ac92d25b-a788-4479-885a-2ae1d30cc38e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 09:31:16 crc kubenswrapper[4755]: I1006 09:31:16.749946 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac92d25b-a788-4479-885a-2ae1d30cc38e-kube-api-access-9hfkw" (OuterVolumeSpecName: "kube-api-access-9hfkw") pod "ac92d25b-a788-4479-885a-2ae1d30cc38e" (UID: "ac92d25b-a788-4479-885a-2ae1d30cc38e"). InnerVolumeSpecName "kube-api-access-9hfkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:31:16 crc kubenswrapper[4755]: I1006 09:31:16.778029 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac92d25b-a788-4479-885a-2ae1d30cc38e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac92d25b-a788-4479-885a-2ae1d30cc38e" (UID: "ac92d25b-a788-4479-885a-2ae1d30cc38e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 09:31:16 crc kubenswrapper[4755]: I1006 09:31:16.838049 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hfkw\" (UniqueName: \"kubernetes.io/projected/ac92d25b-a788-4479-885a-2ae1d30cc38e-kube-api-access-9hfkw\") on node \"crc\" DevicePath \"\""
Oct 06 09:31:16 crc kubenswrapper[4755]: I1006 09:31:16.838082 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac92d25b-a788-4479-885a-2ae1d30cc38e-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 09:31:16 crc kubenswrapper[4755]: I1006 09:31:16.838095 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac92d25b-a788-4479-885a-2ae1d30cc38e-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.112946 4755 generic.go:334] "Generic (PLEG): container finished" podID="ac92d25b-a788-4479-885a-2ae1d30cc38e" containerID="28d0362c73120d95d12412a46686a0a67c53ff70f3d97dcae62e29c06fdc06e2" exitCode=0
Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.113024 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qr9m" event={"ID":"ac92d25b-a788-4479-885a-2ae1d30cc38e","Type":"ContainerDied","Data":"28d0362c73120d95d12412a46686a0a67c53ff70f3d97dcae62e29c06fdc06e2"}
Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.113078 4755 kubelet.go:2453] "SyncLoop (PLEG): event
for pod" pod="openshift-marketplace/community-operators-9qr9m" event={"ID":"ac92d25b-a788-4479-885a-2ae1d30cc38e","Type":"ContainerDied","Data":"df16c7343a4254a10eb491850bccbc55f575ed1a9819d7227326c5092e39b674"} Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.113107 4755 scope.go:117] "RemoveContainer" containerID="28d0362c73120d95d12412a46686a0a67c53ff70f3d97dcae62e29c06fdc06e2" Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.113341 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9qr9m" Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.147867 4755 scope.go:117] "RemoveContainer" containerID="5227d0318142ec469bce9c04ee1de175eb8c1b6e009ff5d3e4616ddf4d6bb6bf" Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.153468 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9qr9m"] Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.166547 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9qr9m"] Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.182148 4755 scope.go:117] "RemoveContainer" containerID="39306401d3fb79714711d08fa0f7830b23c25f8ba4409ae81842567af6d686ab" Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.232251 4755 scope.go:117] "RemoveContainer" containerID="28d0362c73120d95d12412a46686a0a67c53ff70f3d97dcae62e29c06fdc06e2" Oct 06 09:31:17 crc kubenswrapper[4755]: E1006 09:31:17.233460 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28d0362c73120d95d12412a46686a0a67c53ff70f3d97dcae62e29c06fdc06e2\": container with ID starting with 28d0362c73120d95d12412a46686a0a67c53ff70f3d97dcae62e29c06fdc06e2 not found: ID does not exist" containerID="28d0362c73120d95d12412a46686a0a67c53ff70f3d97dcae62e29c06fdc06e2" Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 
09:31:17.233680 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d0362c73120d95d12412a46686a0a67c53ff70f3d97dcae62e29c06fdc06e2"} err="failed to get container status \"28d0362c73120d95d12412a46686a0a67c53ff70f3d97dcae62e29c06fdc06e2\": rpc error: code = NotFound desc = could not find container \"28d0362c73120d95d12412a46686a0a67c53ff70f3d97dcae62e29c06fdc06e2\": container with ID starting with 28d0362c73120d95d12412a46686a0a67c53ff70f3d97dcae62e29c06fdc06e2 not found: ID does not exist" Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.233789 4755 scope.go:117] "RemoveContainer" containerID="5227d0318142ec469bce9c04ee1de175eb8c1b6e009ff5d3e4616ddf4d6bb6bf" Oct 06 09:31:17 crc kubenswrapper[4755]: E1006 09:31:17.234656 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5227d0318142ec469bce9c04ee1de175eb8c1b6e009ff5d3e4616ddf4d6bb6bf\": container with ID starting with 5227d0318142ec469bce9c04ee1de175eb8c1b6e009ff5d3e4616ddf4d6bb6bf not found: ID does not exist" containerID="5227d0318142ec469bce9c04ee1de175eb8c1b6e009ff5d3e4616ddf4d6bb6bf" Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.234724 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5227d0318142ec469bce9c04ee1de175eb8c1b6e009ff5d3e4616ddf4d6bb6bf"} err="failed to get container status \"5227d0318142ec469bce9c04ee1de175eb8c1b6e009ff5d3e4616ddf4d6bb6bf\": rpc error: code = NotFound desc = could not find container \"5227d0318142ec469bce9c04ee1de175eb8c1b6e009ff5d3e4616ddf4d6bb6bf\": container with ID starting with 5227d0318142ec469bce9c04ee1de175eb8c1b6e009ff5d3e4616ddf4d6bb6bf not found: ID does not exist" Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.234770 4755 scope.go:117] "RemoveContainer" containerID="39306401d3fb79714711d08fa0f7830b23c25f8ba4409ae81842567af6d686ab" Oct 06 09:31:17 crc 
kubenswrapper[4755]: E1006 09:31:17.235282 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39306401d3fb79714711d08fa0f7830b23c25f8ba4409ae81842567af6d686ab\": container with ID starting with 39306401d3fb79714711d08fa0f7830b23c25f8ba4409ae81842567af6d686ab not found: ID does not exist" containerID="39306401d3fb79714711d08fa0f7830b23c25f8ba4409ae81842567af6d686ab" Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.235339 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39306401d3fb79714711d08fa0f7830b23c25f8ba4409ae81842567af6d686ab"} err="failed to get container status \"39306401d3fb79714711d08fa0f7830b23c25f8ba4409ae81842567af6d686ab\": rpc error: code = NotFound desc = could not find container \"39306401d3fb79714711d08fa0f7830b23c25f8ba4409ae81842567af6d686ab\": container with ID starting with 39306401d3fb79714711d08fa0f7830b23c25f8ba4409ae81842567af6d686ab not found: ID does not exist" Oct 06 09:31:17 crc kubenswrapper[4755]: I1006 09:31:17.890088 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac92d25b-a788-4479-885a-2ae1d30cc38e" path="/var/lib/kubelet/pods/ac92d25b-a788-4479-885a-2ae1d30cc38e/volumes" Oct 06 09:31:19 crc kubenswrapper[4755]: I1006 09:31:19.878865 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e" Oct 06 09:31:20 crc kubenswrapper[4755]: I1006 09:31:20.145319 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"24ad9e11218b962186cb7a3a7b929e040c198ce315f540bca755ae2923311468"} Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.355652 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r5gn6"] Oct 06 09:31:22 crc 
kubenswrapper[4755]: E1006 09:31:22.356913 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac92d25b-a788-4479-885a-2ae1d30cc38e" containerName="registry-server" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.356933 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac92d25b-a788-4479-885a-2ae1d30cc38e" containerName="registry-server" Oct 06 09:31:22 crc kubenswrapper[4755]: E1006 09:31:22.356955 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac92d25b-a788-4479-885a-2ae1d30cc38e" containerName="extract-utilities" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.356963 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac92d25b-a788-4479-885a-2ae1d30cc38e" containerName="extract-utilities" Oct 06 09:31:22 crc kubenswrapper[4755]: E1006 09:31:22.356979 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac92d25b-a788-4479-885a-2ae1d30cc38e" containerName="extract-content" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.356987 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac92d25b-a788-4479-885a-2ae1d30cc38e" containerName="extract-content" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.357244 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac92d25b-a788-4479-885a-2ae1d30cc38e" containerName="registry-server" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.371980 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.376954 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r5gn6"] Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.478600 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr8tj\" (UniqueName: \"kubernetes.io/projected/698bf960-862e-4d04-88db-eea4785846cf-kube-api-access-cr8tj\") pod \"certified-operators-r5gn6\" (UID: \"698bf960-862e-4d04-88db-eea4785846cf\") " pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.479027 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698bf960-862e-4d04-88db-eea4785846cf-catalog-content\") pod \"certified-operators-r5gn6\" (UID: \"698bf960-862e-4d04-88db-eea4785846cf\") " pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.479483 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698bf960-862e-4d04-88db-eea4785846cf-utilities\") pod \"certified-operators-r5gn6\" (UID: \"698bf960-862e-4d04-88db-eea4785846cf\") " pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.582121 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698bf960-862e-4d04-88db-eea4785846cf-utilities\") pod \"certified-operators-r5gn6\" (UID: \"698bf960-862e-4d04-88db-eea4785846cf\") " pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.582297 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cr8tj\" (UniqueName: \"kubernetes.io/projected/698bf960-862e-4d04-88db-eea4785846cf-kube-api-access-cr8tj\") pod \"certified-operators-r5gn6\" (UID: \"698bf960-862e-4d04-88db-eea4785846cf\") " pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.582850 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698bf960-862e-4d04-88db-eea4785846cf-catalog-content\") pod \"certified-operators-r5gn6\" (UID: \"698bf960-862e-4d04-88db-eea4785846cf\") " pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.583129 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698bf960-862e-4d04-88db-eea4785846cf-utilities\") pod \"certified-operators-r5gn6\" (UID: \"698bf960-862e-4d04-88db-eea4785846cf\") " pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.583525 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698bf960-862e-4d04-88db-eea4785846cf-catalog-content\") pod \"certified-operators-r5gn6\" (UID: \"698bf960-862e-4d04-88db-eea4785846cf\") " pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.620033 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr8tj\" (UniqueName: \"kubernetes.io/projected/698bf960-862e-4d04-88db-eea4785846cf-kube-api-access-cr8tj\") pod \"certified-operators-r5gn6\" (UID: \"698bf960-862e-4d04-88db-eea4785846cf\") " pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:22 crc kubenswrapper[4755]: I1006 09:31:22.702631 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:23 crc kubenswrapper[4755]: I1006 09:31:23.376499 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r5gn6"] Oct 06 09:31:23 crc kubenswrapper[4755]: W1006 09:31:23.380456 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod698bf960_862e_4d04_88db_eea4785846cf.slice/crio-7df141b2cfb99db583b139382611e4c82ad5dd8b5a0391f068f7d51c2547052b WatchSource:0}: Error finding container 7df141b2cfb99db583b139382611e4c82ad5dd8b5a0391f068f7d51c2547052b: Status 404 returned error can't find the container with id 7df141b2cfb99db583b139382611e4c82ad5dd8b5a0391f068f7d51c2547052b Oct 06 09:31:24 crc kubenswrapper[4755]: I1006 09:31:24.205183 4755 generic.go:334] "Generic (PLEG): container finished" podID="698bf960-862e-4d04-88db-eea4785846cf" containerID="56f99d7950e33f544cc60cd3b0bd2740fb3b383d95401af5d02ffa7f459c87ab" exitCode=0 Oct 06 09:31:24 crc kubenswrapper[4755]: I1006 09:31:24.205365 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5gn6" event={"ID":"698bf960-862e-4d04-88db-eea4785846cf","Type":"ContainerDied","Data":"56f99d7950e33f544cc60cd3b0bd2740fb3b383d95401af5d02ffa7f459c87ab"} Oct 06 09:31:24 crc kubenswrapper[4755]: I1006 09:31:24.205745 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5gn6" event={"ID":"698bf960-862e-4d04-88db-eea4785846cf","Type":"ContainerStarted","Data":"7df141b2cfb99db583b139382611e4c82ad5dd8b5a0391f068f7d51c2547052b"} Oct 06 09:31:26 crc kubenswrapper[4755]: I1006 09:31:26.231365 4755 generic.go:334] "Generic (PLEG): container finished" podID="698bf960-862e-4d04-88db-eea4785846cf" containerID="ed3a014dec3370fd090fcab596b780da47cb2e6a294e2568314b58e59196c216" exitCode=0 Oct 06 09:31:26 crc kubenswrapper[4755]: I1006 
09:31:26.231458 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5gn6" event={"ID":"698bf960-862e-4d04-88db-eea4785846cf","Type":"ContainerDied","Data":"ed3a014dec3370fd090fcab596b780da47cb2e6a294e2568314b58e59196c216"} Oct 06 09:31:27 crc kubenswrapper[4755]: I1006 09:31:27.242245 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5gn6" event={"ID":"698bf960-862e-4d04-88db-eea4785846cf","Type":"ContainerStarted","Data":"253cadd2f3bdb93a7c31dee8a2e6f1c27e2826aee4f7f61dad395d0708c0f0bb"} Oct 06 09:31:32 crc kubenswrapper[4755]: I1006 09:31:32.704429 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:32 crc kubenswrapper[4755]: I1006 09:31:32.705010 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:32 crc kubenswrapper[4755]: I1006 09:31:32.767591 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:32 crc kubenswrapper[4755]: I1006 09:31:32.793332 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r5gn6" podStartSLOduration=8.266624816 podStartE2EDuration="10.793309352s" podCreationTimestamp="2025-10-06 09:31:22 +0000 UTC" firstStartedPulling="2025-10-06 09:31:24.209600484 +0000 UTC m=+4141.038915698" lastFinishedPulling="2025-10-06 09:31:26.73628502 +0000 UTC m=+4143.565600234" observedRunningTime="2025-10-06 09:31:27.273271703 +0000 UTC m=+4144.102586967" watchObservedRunningTime="2025-10-06 09:31:32.793309352 +0000 UTC m=+4149.622624566" Oct 06 09:31:33 crc kubenswrapper[4755]: I1006 09:31:33.377990 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r5gn6" Oct 
06 09:31:34 crc kubenswrapper[4755]: I1006 09:31:34.005800 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r5gn6"] Oct 06 09:31:35 crc kubenswrapper[4755]: I1006 09:31:35.346496 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r5gn6" podUID="698bf960-862e-4d04-88db-eea4785846cf" containerName="registry-server" containerID="cri-o://253cadd2f3bdb93a7c31dee8a2e6f1c27e2826aee4f7f61dad395d0708c0f0bb" gracePeriod=2 Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.356614 4755 generic.go:334] "Generic (PLEG): container finished" podID="698bf960-862e-4d04-88db-eea4785846cf" containerID="253cadd2f3bdb93a7c31dee8a2e6f1c27e2826aee4f7f61dad395d0708c0f0bb" exitCode=0 Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.356692 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5gn6" event={"ID":"698bf960-862e-4d04-88db-eea4785846cf","Type":"ContainerDied","Data":"253cadd2f3bdb93a7c31dee8a2e6f1c27e2826aee4f7f61dad395d0708c0f0bb"} Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.356941 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5gn6" event={"ID":"698bf960-862e-4d04-88db-eea4785846cf","Type":"ContainerDied","Data":"7df141b2cfb99db583b139382611e4c82ad5dd8b5a0391f068f7d51c2547052b"} Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.356958 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7df141b2cfb99db583b139382611e4c82ad5dd8b5a0391f068f7d51c2547052b" Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.514381 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.607218 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698bf960-862e-4d04-88db-eea4785846cf-catalog-content\") pod \"698bf960-862e-4d04-88db-eea4785846cf\" (UID: \"698bf960-862e-4d04-88db-eea4785846cf\") " Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.607440 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr8tj\" (UniqueName: \"kubernetes.io/projected/698bf960-862e-4d04-88db-eea4785846cf-kube-api-access-cr8tj\") pod \"698bf960-862e-4d04-88db-eea4785846cf\" (UID: \"698bf960-862e-4d04-88db-eea4785846cf\") " Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.607534 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698bf960-862e-4d04-88db-eea4785846cf-utilities\") pod \"698bf960-862e-4d04-88db-eea4785846cf\" (UID: \"698bf960-862e-4d04-88db-eea4785846cf\") " Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.609942 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/698bf960-862e-4d04-88db-eea4785846cf-utilities" (OuterVolumeSpecName: "utilities") pod "698bf960-862e-4d04-88db-eea4785846cf" (UID: "698bf960-862e-4d04-88db-eea4785846cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.621888 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698bf960-862e-4d04-88db-eea4785846cf-kube-api-access-cr8tj" (OuterVolumeSpecName: "kube-api-access-cr8tj") pod "698bf960-862e-4d04-88db-eea4785846cf" (UID: "698bf960-862e-4d04-88db-eea4785846cf"). InnerVolumeSpecName "kube-api-access-cr8tj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.712128 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr8tj\" (UniqueName: \"kubernetes.io/projected/698bf960-862e-4d04-88db-eea4785846cf-kube-api-access-cr8tj\") on node \"crc\" DevicePath \"\"" Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.712191 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698bf960-862e-4d04-88db-eea4785846cf-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.901104 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/698bf960-862e-4d04-88db-eea4785846cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "698bf960-862e-4d04-88db-eea4785846cf" (UID: "698bf960-862e-4d04-88db-eea4785846cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:31:36 crc kubenswrapper[4755]: I1006 09:31:36.916496 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698bf960-862e-4d04-88db-eea4785846cf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:31:37 crc kubenswrapper[4755]: I1006 09:31:37.367964 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r5gn6" Oct 06 09:31:37 crc kubenswrapper[4755]: I1006 09:31:37.412052 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r5gn6"] Oct 06 09:31:37 crc kubenswrapper[4755]: I1006 09:31:37.420530 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r5gn6"] Oct 06 09:31:37 crc kubenswrapper[4755]: I1006 09:31:37.889309 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698bf960-862e-4d04-88db-eea4785846cf" path="/var/lib/kubelet/pods/698bf960-862e-4d04-88db-eea4785846cf/volumes" Oct 06 09:33:48 crc kubenswrapper[4755]: I1006 09:33:48.912781 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:33:48 crc kubenswrapper[4755]: I1006 09:33:48.914239 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:34:18 crc kubenswrapper[4755]: I1006 09:34:18.912138 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:34:18 crc kubenswrapper[4755]: I1006 09:34:18.912666 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" 
podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:34:20 crc kubenswrapper[4755]: I1006 09:34:20.983905 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gn9ml"] Oct 06 09:34:20 crc kubenswrapper[4755]: E1006 09:34:20.984729 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698bf960-862e-4d04-88db-eea4785846cf" containerName="extract-utilities" Oct 06 09:34:20 crc kubenswrapper[4755]: I1006 09:34:20.984748 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="698bf960-862e-4d04-88db-eea4785846cf" containerName="extract-utilities" Oct 06 09:34:20 crc kubenswrapper[4755]: E1006 09:34:20.984766 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698bf960-862e-4d04-88db-eea4785846cf" containerName="extract-content" Oct 06 09:34:20 crc kubenswrapper[4755]: I1006 09:34:20.984774 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="698bf960-862e-4d04-88db-eea4785846cf" containerName="extract-content" Oct 06 09:34:20 crc kubenswrapper[4755]: E1006 09:34:20.984786 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698bf960-862e-4d04-88db-eea4785846cf" containerName="registry-server" Oct 06 09:34:20 crc kubenswrapper[4755]: I1006 09:34:20.984794 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="698bf960-862e-4d04-88db-eea4785846cf" containerName="registry-server" Oct 06 09:34:20 crc kubenswrapper[4755]: I1006 09:34:20.985041 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="698bf960-862e-4d04-88db-eea4785846cf" containerName="registry-server" Oct 06 09:34:20 crc kubenswrapper[4755]: I1006 09:34:20.986721 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:20 crc kubenswrapper[4755]: I1006 09:34:20.997551 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gn9ml"] Oct 06 09:34:21 crc kubenswrapper[4755]: I1006 09:34:21.127195 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4r64\" (UniqueName: \"kubernetes.io/projected/dfbdb72b-c434-4522-9da4-81be126ba2e6-kube-api-access-v4r64\") pod \"redhat-operators-gn9ml\" (UID: \"dfbdb72b-c434-4522-9da4-81be126ba2e6\") " pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:21 crc kubenswrapper[4755]: I1006 09:34:21.127292 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfbdb72b-c434-4522-9da4-81be126ba2e6-utilities\") pod \"redhat-operators-gn9ml\" (UID: \"dfbdb72b-c434-4522-9da4-81be126ba2e6\") " pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:21 crc kubenswrapper[4755]: I1006 09:34:21.127442 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfbdb72b-c434-4522-9da4-81be126ba2e6-catalog-content\") pod \"redhat-operators-gn9ml\" (UID: \"dfbdb72b-c434-4522-9da4-81be126ba2e6\") " pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:21 crc kubenswrapper[4755]: I1006 09:34:21.228627 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfbdb72b-c434-4522-9da4-81be126ba2e6-catalog-content\") pod \"redhat-operators-gn9ml\" (UID: \"dfbdb72b-c434-4522-9da4-81be126ba2e6\") " pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:21 crc kubenswrapper[4755]: I1006 09:34:21.228736 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-v4r64\" (UniqueName: \"kubernetes.io/projected/dfbdb72b-c434-4522-9da4-81be126ba2e6-kube-api-access-v4r64\") pod \"redhat-operators-gn9ml\" (UID: \"dfbdb72b-c434-4522-9da4-81be126ba2e6\") " pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:21 crc kubenswrapper[4755]: I1006 09:34:21.228795 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfbdb72b-c434-4522-9da4-81be126ba2e6-utilities\") pod \"redhat-operators-gn9ml\" (UID: \"dfbdb72b-c434-4522-9da4-81be126ba2e6\") " pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:21 crc kubenswrapper[4755]: I1006 09:34:21.229265 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfbdb72b-c434-4522-9da4-81be126ba2e6-utilities\") pod \"redhat-operators-gn9ml\" (UID: \"dfbdb72b-c434-4522-9da4-81be126ba2e6\") " pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:21 crc kubenswrapper[4755]: I1006 09:34:21.229272 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfbdb72b-c434-4522-9da4-81be126ba2e6-catalog-content\") pod \"redhat-operators-gn9ml\" (UID: \"dfbdb72b-c434-4522-9da4-81be126ba2e6\") " pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:21 crc kubenswrapper[4755]: I1006 09:34:21.247284 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4r64\" (UniqueName: \"kubernetes.io/projected/dfbdb72b-c434-4522-9da4-81be126ba2e6-kube-api-access-v4r64\") pod \"redhat-operators-gn9ml\" (UID: \"dfbdb72b-c434-4522-9da4-81be126ba2e6\") " pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:21 crc kubenswrapper[4755]: I1006 09:34:21.338514 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:21 crc kubenswrapper[4755]: I1006 09:34:21.851209 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gn9ml"] Oct 06 09:34:21 crc kubenswrapper[4755]: I1006 09:34:21.867735 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn9ml" event={"ID":"dfbdb72b-c434-4522-9da4-81be126ba2e6","Type":"ContainerStarted","Data":"5988f87123e232dc5d9a97344cf79dd3ded26f43c291fceebd8d46ce2398c7aa"} Oct 06 09:34:22 crc kubenswrapper[4755]: I1006 09:34:22.877775 4755 generic.go:334] "Generic (PLEG): container finished" podID="dfbdb72b-c434-4522-9da4-81be126ba2e6" containerID="0acf1b84a8abd01401a0151ef97d77c678af1ed1157f9b3ec87c3e289f2ff609" exitCode=0 Oct 06 09:34:22 crc kubenswrapper[4755]: I1006 09:34:22.877812 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn9ml" event={"ID":"dfbdb72b-c434-4522-9da4-81be126ba2e6","Type":"ContainerDied","Data":"0acf1b84a8abd01401a0151ef97d77c678af1ed1157f9b3ec87c3e289f2ff609"} Oct 06 09:34:23 crc kubenswrapper[4755]: I1006 09:34:23.900932 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn9ml" event={"ID":"dfbdb72b-c434-4522-9da4-81be126ba2e6","Type":"ContainerStarted","Data":"3cd099a779e043a58ab598c73d2766eb397e2302e0d49b1fad9b4bf9a6a1969a"} Oct 06 09:34:24 crc kubenswrapper[4755]: I1006 09:34:24.911244 4755 generic.go:334] "Generic (PLEG): container finished" podID="dfbdb72b-c434-4522-9da4-81be126ba2e6" containerID="3cd099a779e043a58ab598c73d2766eb397e2302e0d49b1fad9b4bf9a6a1969a" exitCode=0 Oct 06 09:34:24 crc kubenswrapper[4755]: I1006 09:34:24.911353 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn9ml" 
event={"ID":"dfbdb72b-c434-4522-9da4-81be126ba2e6","Type":"ContainerDied","Data":"3cd099a779e043a58ab598c73d2766eb397e2302e0d49b1fad9b4bf9a6a1969a"} Oct 06 09:34:25 crc kubenswrapper[4755]: I1006 09:34:25.920957 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn9ml" event={"ID":"dfbdb72b-c434-4522-9da4-81be126ba2e6","Type":"ContainerStarted","Data":"c1c8f2cffc01e147d8938f72ccdaa74be3ab27ecfc647b1e77b1ba3579aca94d"} Oct 06 09:34:25 crc kubenswrapper[4755]: I1006 09:34:25.946495 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gn9ml" podStartSLOduration=3.473833809 podStartE2EDuration="5.946286802s" podCreationTimestamp="2025-10-06 09:34:20 +0000 UTC" firstStartedPulling="2025-10-06 09:34:22.882407708 +0000 UTC m=+4319.711722922" lastFinishedPulling="2025-10-06 09:34:25.354860701 +0000 UTC m=+4322.184175915" observedRunningTime="2025-10-06 09:34:25.938539721 +0000 UTC m=+4322.767854955" watchObservedRunningTime="2025-10-06 09:34:25.946286802 +0000 UTC m=+4322.775602016" Oct 06 09:34:31 crc kubenswrapper[4755]: I1006 09:34:31.338711 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:31 crc kubenswrapper[4755]: I1006 09:34:31.339357 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:31 crc kubenswrapper[4755]: I1006 09:34:31.387311 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:32 crc kubenswrapper[4755]: I1006 09:34:32.033295 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:32 crc kubenswrapper[4755]: I1006 09:34:32.078322 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-gn9ml"] Oct 06 09:34:33 crc kubenswrapper[4755]: I1006 09:34:33.997686 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gn9ml" podUID="dfbdb72b-c434-4522-9da4-81be126ba2e6" containerName="registry-server" containerID="cri-o://c1c8f2cffc01e147d8938f72ccdaa74be3ab27ecfc647b1e77b1ba3579aca94d" gracePeriod=2 Oct 06 09:34:34 crc kubenswrapper[4755]: I1006 09:34:34.526695 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:34 crc kubenswrapper[4755]: I1006 09:34:34.600875 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfbdb72b-c434-4522-9da4-81be126ba2e6-utilities\") pod \"dfbdb72b-c434-4522-9da4-81be126ba2e6\" (UID: \"dfbdb72b-c434-4522-9da4-81be126ba2e6\") " Oct 06 09:34:34 crc kubenswrapper[4755]: I1006 09:34:34.600971 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfbdb72b-c434-4522-9da4-81be126ba2e6-catalog-content\") pod \"dfbdb72b-c434-4522-9da4-81be126ba2e6\" (UID: \"dfbdb72b-c434-4522-9da4-81be126ba2e6\") " Oct 06 09:34:34 crc kubenswrapper[4755]: I1006 09:34:34.601121 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4r64\" (UniqueName: \"kubernetes.io/projected/dfbdb72b-c434-4522-9da4-81be126ba2e6-kube-api-access-v4r64\") pod \"dfbdb72b-c434-4522-9da4-81be126ba2e6\" (UID: \"dfbdb72b-c434-4522-9da4-81be126ba2e6\") " Oct 06 09:34:34 crc kubenswrapper[4755]: I1006 09:34:34.602618 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfbdb72b-c434-4522-9da4-81be126ba2e6-utilities" (OuterVolumeSpecName: "utilities") pod "dfbdb72b-c434-4522-9da4-81be126ba2e6" (UID: 
"dfbdb72b-c434-4522-9da4-81be126ba2e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:34:34 crc kubenswrapper[4755]: I1006 09:34:34.611855 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfbdb72b-c434-4522-9da4-81be126ba2e6-kube-api-access-v4r64" (OuterVolumeSpecName: "kube-api-access-v4r64") pod "dfbdb72b-c434-4522-9da4-81be126ba2e6" (UID: "dfbdb72b-c434-4522-9da4-81be126ba2e6"). InnerVolumeSpecName "kube-api-access-v4r64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:34:34 crc kubenswrapper[4755]: I1006 09:34:34.704963 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfbdb72b-c434-4522-9da4-81be126ba2e6-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:34:34 crc kubenswrapper[4755]: I1006 09:34:34.705023 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4r64\" (UniqueName: \"kubernetes.io/projected/dfbdb72b-c434-4522-9da4-81be126ba2e6-kube-api-access-v4r64\") on node \"crc\" DevicePath \"\"" Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.011423 4755 generic.go:334] "Generic (PLEG): container finished" podID="dfbdb72b-c434-4522-9da4-81be126ba2e6" containerID="c1c8f2cffc01e147d8938f72ccdaa74be3ab27ecfc647b1e77b1ba3579aca94d" exitCode=0 Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.011792 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn9ml" event={"ID":"dfbdb72b-c434-4522-9da4-81be126ba2e6","Type":"ContainerDied","Data":"c1c8f2cffc01e147d8938f72ccdaa74be3ab27ecfc647b1e77b1ba3579aca94d"} Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.011831 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gn9ml" 
event={"ID":"dfbdb72b-c434-4522-9da4-81be126ba2e6","Type":"ContainerDied","Data":"5988f87123e232dc5d9a97344cf79dd3ded26f43c291fceebd8d46ce2398c7aa"} Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.011858 4755 scope.go:117] "RemoveContainer" containerID="c1c8f2cffc01e147d8938f72ccdaa74be3ab27ecfc647b1e77b1ba3579aca94d" Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.012032 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gn9ml" Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.042092 4755 scope.go:117] "RemoveContainer" containerID="3cd099a779e043a58ab598c73d2766eb397e2302e0d49b1fad9b4bf9a6a1969a" Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.061948 4755 scope.go:117] "RemoveContainer" containerID="0acf1b84a8abd01401a0151ef97d77c678af1ed1157f9b3ec87c3e289f2ff609" Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.140531 4755 scope.go:117] "RemoveContainer" containerID="c1c8f2cffc01e147d8938f72ccdaa74be3ab27ecfc647b1e77b1ba3579aca94d" Oct 06 09:34:35 crc kubenswrapper[4755]: E1006 09:34:35.140994 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1c8f2cffc01e147d8938f72ccdaa74be3ab27ecfc647b1e77b1ba3579aca94d\": container with ID starting with c1c8f2cffc01e147d8938f72ccdaa74be3ab27ecfc647b1e77b1ba3579aca94d not found: ID does not exist" containerID="c1c8f2cffc01e147d8938f72ccdaa74be3ab27ecfc647b1e77b1ba3579aca94d" Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.141052 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c8f2cffc01e147d8938f72ccdaa74be3ab27ecfc647b1e77b1ba3579aca94d"} err="failed to get container status \"c1c8f2cffc01e147d8938f72ccdaa74be3ab27ecfc647b1e77b1ba3579aca94d\": rpc error: code = NotFound desc = could not find container \"c1c8f2cffc01e147d8938f72ccdaa74be3ab27ecfc647b1e77b1ba3579aca94d\": 
container with ID starting with c1c8f2cffc01e147d8938f72ccdaa74be3ab27ecfc647b1e77b1ba3579aca94d not found: ID does not exist" Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.141089 4755 scope.go:117] "RemoveContainer" containerID="3cd099a779e043a58ab598c73d2766eb397e2302e0d49b1fad9b4bf9a6a1969a" Oct 06 09:34:35 crc kubenswrapper[4755]: E1006 09:34:35.141349 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cd099a779e043a58ab598c73d2766eb397e2302e0d49b1fad9b4bf9a6a1969a\": container with ID starting with 3cd099a779e043a58ab598c73d2766eb397e2302e0d49b1fad9b4bf9a6a1969a not found: ID does not exist" containerID="3cd099a779e043a58ab598c73d2766eb397e2302e0d49b1fad9b4bf9a6a1969a" Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.141389 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cd099a779e043a58ab598c73d2766eb397e2302e0d49b1fad9b4bf9a6a1969a"} err="failed to get container status \"3cd099a779e043a58ab598c73d2766eb397e2302e0d49b1fad9b4bf9a6a1969a\": rpc error: code = NotFound desc = could not find container \"3cd099a779e043a58ab598c73d2766eb397e2302e0d49b1fad9b4bf9a6a1969a\": container with ID starting with 3cd099a779e043a58ab598c73d2766eb397e2302e0d49b1fad9b4bf9a6a1969a not found: ID does not exist" Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.141413 4755 scope.go:117] "RemoveContainer" containerID="0acf1b84a8abd01401a0151ef97d77c678af1ed1157f9b3ec87c3e289f2ff609" Oct 06 09:34:35 crc kubenswrapper[4755]: E1006 09:34:35.141730 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0acf1b84a8abd01401a0151ef97d77c678af1ed1157f9b3ec87c3e289f2ff609\": container with ID starting with 0acf1b84a8abd01401a0151ef97d77c678af1ed1157f9b3ec87c3e289f2ff609 not found: ID does not exist" 
containerID="0acf1b84a8abd01401a0151ef97d77c678af1ed1157f9b3ec87c3e289f2ff609" Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.141779 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0acf1b84a8abd01401a0151ef97d77c678af1ed1157f9b3ec87c3e289f2ff609"} err="failed to get container status \"0acf1b84a8abd01401a0151ef97d77c678af1ed1157f9b3ec87c3e289f2ff609\": rpc error: code = NotFound desc = could not find container \"0acf1b84a8abd01401a0151ef97d77c678af1ed1157f9b3ec87c3e289f2ff609\": container with ID starting with 0acf1b84a8abd01401a0151ef97d77c678af1ed1157f9b3ec87c3e289f2ff609 not found: ID does not exist" Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.691915 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfbdb72b-c434-4522-9da4-81be126ba2e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dfbdb72b-c434-4522-9da4-81be126ba2e6" (UID: "dfbdb72b-c434-4522-9da4-81be126ba2e6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.727667 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfbdb72b-c434-4522-9da4-81be126ba2e6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.938181 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gn9ml"] Oct 06 09:34:35 crc kubenswrapper[4755]: I1006 09:34:35.946045 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gn9ml"] Oct 06 09:34:37 crc kubenswrapper[4755]: I1006 09:34:37.891577 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfbdb72b-c434-4522-9da4-81be126ba2e6" path="/var/lib/kubelet/pods/dfbdb72b-c434-4522-9da4-81be126ba2e6/volumes" Oct 06 09:34:48 crc kubenswrapper[4755]: I1006 09:34:48.911969 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:34:48 crc kubenswrapper[4755]: I1006 09:34:48.912740 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:34:48 crc kubenswrapper[4755]: I1006 09:34:48.912789 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 09:34:48 crc kubenswrapper[4755]: I1006 09:34:48.913533 4755 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"24ad9e11218b962186cb7a3a7b929e040c198ce315f540bca755ae2923311468"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:34:48 crc kubenswrapper[4755]: I1006 09:34:48.913608 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://24ad9e11218b962186cb7a3a7b929e040c198ce315f540bca755ae2923311468" gracePeriod=600 Oct 06 09:34:49 crc kubenswrapper[4755]: E1006 09:34:49.031236 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod854f4c9e_3c8a_47bb_9427_bb5bfc5691d7.slice/crio-24ad9e11218b962186cb7a3a7b929e040c198ce315f540bca755ae2923311468.scope\": RecentStats: unable to find data in memory cache]" Oct 06 09:34:49 crc kubenswrapper[4755]: I1006 09:34:49.134636 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="24ad9e11218b962186cb7a3a7b929e040c198ce315f540bca755ae2923311468" exitCode=0 Oct 06 09:34:49 crc kubenswrapper[4755]: I1006 09:34:49.134964 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"24ad9e11218b962186cb7a3a7b929e040c198ce315f540bca755ae2923311468"} Oct 06 09:34:49 crc kubenswrapper[4755]: I1006 09:34:49.135003 4755 scope.go:117] "RemoveContainer" containerID="0c2ab641f3ba049f417966866f5835c5a5553a68f3694c5e4975ea1bc2d19e8e" Oct 06 09:34:50 crc kubenswrapper[4755]: I1006 09:34:50.145234 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd"} Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.515625 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dgsrz"] Oct 06 09:35:44 crc kubenswrapper[4755]: E1006 09:35:44.517076 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfbdb72b-c434-4522-9da4-81be126ba2e6" containerName="extract-content" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.517096 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfbdb72b-c434-4522-9da4-81be126ba2e6" containerName="extract-content" Oct 06 09:35:44 crc kubenswrapper[4755]: E1006 09:35:44.517113 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfbdb72b-c434-4522-9da4-81be126ba2e6" containerName="registry-server" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.517120 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfbdb72b-c434-4522-9da4-81be126ba2e6" containerName="registry-server" Oct 06 09:35:44 crc kubenswrapper[4755]: E1006 09:35:44.517164 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfbdb72b-c434-4522-9da4-81be126ba2e6" containerName="extract-utilities" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.517173 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfbdb72b-c434-4522-9da4-81be126ba2e6" containerName="extract-utilities" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.517373 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfbdb72b-c434-4522-9da4-81be126ba2e6" containerName="registry-server" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.518725 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.532773 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgsrz"] Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.653338 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783ae4c5-6d06-45bd-b779-de99a55a323a-utilities\") pod \"redhat-marketplace-dgsrz\" (UID: \"783ae4c5-6d06-45bd-b779-de99a55a323a\") " pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.653407 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783ae4c5-6d06-45bd-b779-de99a55a323a-catalog-content\") pod \"redhat-marketplace-dgsrz\" (UID: \"783ae4c5-6d06-45bd-b779-de99a55a323a\") " pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.653450 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7wcl\" (UniqueName: \"kubernetes.io/projected/783ae4c5-6d06-45bd-b779-de99a55a323a-kube-api-access-v7wcl\") pod \"redhat-marketplace-dgsrz\" (UID: \"783ae4c5-6d06-45bd-b779-de99a55a323a\") " pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.755259 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783ae4c5-6d06-45bd-b779-de99a55a323a-utilities\") pod \"redhat-marketplace-dgsrz\" (UID: \"783ae4c5-6d06-45bd-b779-de99a55a323a\") " pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.755326 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783ae4c5-6d06-45bd-b779-de99a55a323a-catalog-content\") pod \"redhat-marketplace-dgsrz\" (UID: \"783ae4c5-6d06-45bd-b779-de99a55a323a\") " pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.755363 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7wcl\" (UniqueName: \"kubernetes.io/projected/783ae4c5-6d06-45bd-b779-de99a55a323a-kube-api-access-v7wcl\") pod \"redhat-marketplace-dgsrz\" (UID: \"783ae4c5-6d06-45bd-b779-de99a55a323a\") " pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.755900 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783ae4c5-6d06-45bd-b779-de99a55a323a-utilities\") pod \"redhat-marketplace-dgsrz\" (UID: \"783ae4c5-6d06-45bd-b779-de99a55a323a\") " pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.755918 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783ae4c5-6d06-45bd-b779-de99a55a323a-catalog-content\") pod \"redhat-marketplace-dgsrz\" (UID: \"783ae4c5-6d06-45bd-b779-de99a55a323a\") " pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.775309 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7wcl\" (UniqueName: \"kubernetes.io/projected/783ae4c5-6d06-45bd-b779-de99a55a323a-kube-api-access-v7wcl\") pod \"redhat-marketplace-dgsrz\" (UID: \"783ae4c5-6d06-45bd-b779-de99a55a323a\") " pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:44 crc kubenswrapper[4755]: I1006 09:35:44.855812 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:45 crc kubenswrapper[4755]: I1006 09:35:45.372775 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgsrz"] Oct 06 09:35:45 crc kubenswrapper[4755]: W1006 09:35:45.389483 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod783ae4c5_6d06_45bd_b779_de99a55a323a.slice/crio-f1065743dc505584b89e197a4f1929393467e91e77ee0a3cc7528d9dcd15965f WatchSource:0}: Error finding container f1065743dc505584b89e197a4f1929393467e91e77ee0a3cc7528d9dcd15965f: Status 404 returned error can't find the container with id f1065743dc505584b89e197a4f1929393467e91e77ee0a3cc7528d9dcd15965f Oct 06 09:35:45 crc kubenswrapper[4755]: I1006 09:35:45.655282 4755 generic.go:334] "Generic (PLEG): container finished" podID="783ae4c5-6d06-45bd-b779-de99a55a323a" containerID="a6e8b604c587e7091c62f1e3dc8119d477affbdcc7a140eb7920d7c55b01ff50" exitCode=0 Oct 06 09:35:45 crc kubenswrapper[4755]: I1006 09:35:45.655382 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgsrz" event={"ID":"783ae4c5-6d06-45bd-b779-de99a55a323a","Type":"ContainerDied","Data":"a6e8b604c587e7091c62f1e3dc8119d477affbdcc7a140eb7920d7c55b01ff50"} Oct 06 09:35:45 crc kubenswrapper[4755]: I1006 09:35:45.655675 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgsrz" event={"ID":"783ae4c5-6d06-45bd-b779-de99a55a323a","Type":"ContainerStarted","Data":"f1065743dc505584b89e197a4f1929393467e91e77ee0a3cc7528d9dcd15965f"} Oct 06 09:35:47 crc kubenswrapper[4755]: I1006 09:35:47.674915 4755 generic.go:334] "Generic (PLEG): container finished" podID="783ae4c5-6d06-45bd-b779-de99a55a323a" containerID="36ad8fa1b09153e162cfdb0f7ff31ccc447f3ebc0d8ed2edaeb82705dcfaf30d" exitCode=0 Oct 06 09:35:47 crc kubenswrapper[4755]: I1006 
09:35:47.675024 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgsrz" event={"ID":"783ae4c5-6d06-45bd-b779-de99a55a323a","Type":"ContainerDied","Data":"36ad8fa1b09153e162cfdb0f7ff31ccc447f3ebc0d8ed2edaeb82705dcfaf30d"} Oct 06 09:35:48 crc kubenswrapper[4755]: I1006 09:35:48.700818 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgsrz" event={"ID":"783ae4c5-6d06-45bd-b779-de99a55a323a","Type":"ContainerStarted","Data":"84db1f6d4ffcb751f1bdccfffa814a0433282ea0dea843f1f4740a8a1110955f"} Oct 06 09:35:48 crc kubenswrapper[4755]: I1006 09:35:48.720758 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dgsrz" podStartSLOduration=2.232820431 podStartE2EDuration="4.720736794s" podCreationTimestamp="2025-10-06 09:35:44 +0000 UTC" firstStartedPulling="2025-10-06 09:35:45.656784518 +0000 UTC m=+4402.486099732" lastFinishedPulling="2025-10-06 09:35:48.144700881 +0000 UTC m=+4404.974016095" observedRunningTime="2025-10-06 09:35:48.717929765 +0000 UTC m=+4405.547244989" watchObservedRunningTime="2025-10-06 09:35:48.720736794 +0000 UTC m=+4405.550052018" Oct 06 09:35:54 crc kubenswrapper[4755]: I1006 09:35:54.856695 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:54 crc kubenswrapper[4755]: I1006 09:35:54.857437 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:54 crc kubenswrapper[4755]: I1006 09:35:54.907726 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:55 crc kubenswrapper[4755]: I1006 09:35:55.816070 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 
09:35:55 crc kubenswrapper[4755]: I1006 09:35:55.860963 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgsrz"] Oct 06 09:35:57 crc kubenswrapper[4755]: I1006 09:35:57.780084 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dgsrz" podUID="783ae4c5-6d06-45bd-b779-de99a55a323a" containerName="registry-server" containerID="cri-o://84db1f6d4ffcb751f1bdccfffa814a0433282ea0dea843f1f4740a8a1110955f" gracePeriod=2 Oct 06 09:35:58 crc kubenswrapper[4755]: I1006 09:35:58.791916 4755 generic.go:334] "Generic (PLEG): container finished" podID="783ae4c5-6d06-45bd-b779-de99a55a323a" containerID="84db1f6d4ffcb751f1bdccfffa814a0433282ea0dea843f1f4740a8a1110955f" exitCode=0 Oct 06 09:35:58 crc kubenswrapper[4755]: I1006 09:35:58.792139 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgsrz" event={"ID":"783ae4c5-6d06-45bd-b779-de99a55a323a","Type":"ContainerDied","Data":"84db1f6d4ffcb751f1bdccfffa814a0433282ea0dea843f1f4740a8a1110955f"} Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.068485 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.181458 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783ae4c5-6d06-45bd-b779-de99a55a323a-utilities\") pod \"783ae4c5-6d06-45bd-b779-de99a55a323a\" (UID: \"783ae4c5-6d06-45bd-b779-de99a55a323a\") " Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.181681 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783ae4c5-6d06-45bd-b779-de99a55a323a-catalog-content\") pod \"783ae4c5-6d06-45bd-b779-de99a55a323a\" (UID: \"783ae4c5-6d06-45bd-b779-de99a55a323a\") " Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.181767 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7wcl\" (UniqueName: \"kubernetes.io/projected/783ae4c5-6d06-45bd-b779-de99a55a323a-kube-api-access-v7wcl\") pod \"783ae4c5-6d06-45bd-b779-de99a55a323a\" (UID: \"783ae4c5-6d06-45bd-b779-de99a55a323a\") " Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.183316 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/783ae4c5-6d06-45bd-b779-de99a55a323a-utilities" (OuterVolumeSpecName: "utilities") pod "783ae4c5-6d06-45bd-b779-de99a55a323a" (UID: "783ae4c5-6d06-45bd-b779-de99a55a323a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.190838 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783ae4c5-6d06-45bd-b779-de99a55a323a-kube-api-access-v7wcl" (OuterVolumeSpecName: "kube-api-access-v7wcl") pod "783ae4c5-6d06-45bd-b779-de99a55a323a" (UID: "783ae4c5-6d06-45bd-b779-de99a55a323a"). InnerVolumeSpecName "kube-api-access-v7wcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.196467 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/783ae4c5-6d06-45bd-b779-de99a55a323a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "783ae4c5-6d06-45bd-b779-de99a55a323a" (UID: "783ae4c5-6d06-45bd-b779-de99a55a323a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.286599 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783ae4c5-6d06-45bd-b779-de99a55a323a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.286645 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783ae4c5-6d06-45bd-b779-de99a55a323a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.286659 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7wcl\" (UniqueName: \"kubernetes.io/projected/783ae4c5-6d06-45bd-b779-de99a55a323a-kube-api-access-v7wcl\") on node \"crc\" DevicePath \"\"" Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.803938 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgsrz" event={"ID":"783ae4c5-6d06-45bd-b779-de99a55a323a","Type":"ContainerDied","Data":"f1065743dc505584b89e197a4f1929393467e91e77ee0a3cc7528d9dcd15965f"} Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.804003 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgsrz" Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.804260 4755 scope.go:117] "RemoveContainer" containerID="84db1f6d4ffcb751f1bdccfffa814a0433282ea0dea843f1f4740a8a1110955f" Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.822892 4755 scope.go:117] "RemoveContainer" containerID="36ad8fa1b09153e162cfdb0f7ff31ccc447f3ebc0d8ed2edaeb82705dcfaf30d" Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.843928 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgsrz"] Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.850810 4755 scope.go:117] "RemoveContainer" containerID="a6e8b604c587e7091c62f1e3dc8119d477affbdcc7a140eb7920d7c55b01ff50" Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.854335 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgsrz"] Oct 06 09:35:59 crc kubenswrapper[4755]: I1006 09:35:59.891973 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783ae4c5-6d06-45bd-b779-de99a55a323a" path="/var/lib/kubelet/pods/783ae4c5-6d06-45bd-b779-de99a55a323a/volumes" Oct 06 09:37:18 crc kubenswrapper[4755]: I1006 09:37:18.912485 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:37:18 crc kubenswrapper[4755]: I1006 09:37:18.913219 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:37:42 crc kubenswrapper[4755]: 
I1006 09:37:42.868849 4755 scope.go:117] "RemoveContainer" containerID="56f99d7950e33f544cc60cd3b0bd2740fb3b383d95401af5d02ffa7f459c87ab" Oct 06 09:37:42 crc kubenswrapper[4755]: I1006 09:37:42.906900 4755 scope.go:117] "RemoveContainer" containerID="ed3a014dec3370fd090fcab596b780da47cb2e6a294e2568314b58e59196c216" Oct 06 09:37:42 crc kubenswrapper[4755]: I1006 09:37:42.956983 4755 scope.go:117] "RemoveContainer" containerID="253cadd2f3bdb93a7c31dee8a2e6f1c27e2826aee4f7f61dad395d0708c0f0bb" Oct 06 09:37:48 crc kubenswrapper[4755]: I1006 09:37:48.912514 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:37:48 crc kubenswrapper[4755]: I1006 09:37:48.913127 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:38:18 crc kubenswrapper[4755]: I1006 09:38:18.912059 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:38:18 crc kubenswrapper[4755]: I1006 09:38:18.912653 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 
09:38:18 crc kubenswrapper[4755]: I1006 09:38:18.912702 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 09:38:18 crc kubenswrapper[4755]: I1006 09:38:18.913518 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:38:18 crc kubenswrapper[4755]: I1006 09:38:18.913592 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" gracePeriod=600 Oct 06 09:38:19 crc kubenswrapper[4755]: E1006 09:38:19.037238 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:38:19 crc kubenswrapper[4755]: I1006 09:38:19.118917 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" exitCode=0 Oct 06 09:38:19 crc kubenswrapper[4755]: I1006 09:38:19.118977 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" 
event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd"} Oct 06 09:38:19 crc kubenswrapper[4755]: I1006 09:38:19.119019 4755 scope.go:117] "RemoveContainer" containerID="24ad9e11218b962186cb7a3a7b929e040c198ce315f540bca755ae2923311468" Oct 06 09:38:19 crc kubenswrapper[4755]: I1006 09:38:19.119908 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:38:19 crc kubenswrapper[4755]: E1006 09:38:19.120232 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:38:32 crc kubenswrapper[4755]: I1006 09:38:32.878802 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:38:32 crc kubenswrapper[4755]: E1006 09:38:32.879591 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:38:43 crc kubenswrapper[4755]: I1006 09:38:43.885471 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:38:43 crc kubenswrapper[4755]: E1006 09:38:43.886300 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:38:55 crc kubenswrapper[4755]: I1006 09:38:55.880181 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:38:55 crc kubenswrapper[4755]: E1006 09:38:55.881177 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:39:08 crc kubenswrapper[4755]: I1006 09:39:08.879029 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:39:08 crc kubenswrapper[4755]: E1006 09:39:08.879825 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:39:22 crc kubenswrapper[4755]: I1006 09:39:22.880014 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:39:22 crc kubenswrapper[4755]: E1006 09:39:22.881043 4755 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:39:34 crc kubenswrapper[4755]: I1006 09:39:34.878883 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:39:34 crc kubenswrapper[4755]: E1006 09:39:34.879505 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:39:46 crc kubenswrapper[4755]: I1006 09:39:46.879472 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:39:46 crc kubenswrapper[4755]: E1006 09:39:46.881249 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:40:01 crc kubenswrapper[4755]: I1006 09:40:01.879806 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:40:01 crc kubenswrapper[4755]: E1006 09:40:01.881118 4755 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:40:16 crc kubenswrapper[4755]: I1006 09:40:16.879683 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:40:16 crc kubenswrapper[4755]: E1006 09:40:16.880677 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:40:31 crc kubenswrapper[4755]: I1006 09:40:31.878687 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:40:31 crc kubenswrapper[4755]: E1006 09:40:31.879949 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:40:44 crc kubenswrapper[4755]: I1006 09:40:44.879663 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:40:44 crc kubenswrapper[4755]: E1006 09:40:44.880972 4755 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:40:58 crc kubenswrapper[4755]: I1006 09:40:58.879590 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:40:58 crc kubenswrapper[4755]: E1006 09:40:58.881396 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:41:09 crc kubenswrapper[4755]: I1006 09:41:09.879068 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:41:09 crc kubenswrapper[4755]: E1006 09:41:09.880003 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:41:20 crc kubenswrapper[4755]: I1006 09:41:20.879806 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:41:20 crc kubenswrapper[4755]: E1006 
09:41:20.880929 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:41:35 crc kubenswrapper[4755]: I1006 09:41:35.879568 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:41:35 crc kubenswrapper[4755]: E1006 09:41:35.880832 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.109370 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qwglh"] Oct 06 09:41:43 crc kubenswrapper[4755]: E1006 09:41:43.111283 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783ae4c5-6d06-45bd-b779-de99a55a323a" containerName="extract-utilities" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.111306 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="783ae4c5-6d06-45bd-b779-de99a55a323a" containerName="extract-utilities" Oct 06 09:41:43 crc kubenswrapper[4755]: E1006 09:41:43.111341 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783ae4c5-6d06-45bd-b779-de99a55a323a" containerName="registry-server" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.111352 4755 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="783ae4c5-6d06-45bd-b779-de99a55a323a" containerName="registry-server" Oct 06 09:41:43 crc kubenswrapper[4755]: E1006 09:41:43.111378 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783ae4c5-6d06-45bd-b779-de99a55a323a" containerName="extract-content" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.111391 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="783ae4c5-6d06-45bd-b779-de99a55a323a" containerName="extract-content" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.111766 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="783ae4c5-6d06-45bd-b779-de99a55a323a" containerName="registry-server" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.114191 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.143179 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8007de1-344a-4a29-830c-6fc2cb8b8364-catalog-content\") pod \"certified-operators-qwglh\" (UID: \"a8007de1-344a-4a29-830c-6fc2cb8b8364\") " pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.143408 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbgs4\" (UniqueName: \"kubernetes.io/projected/a8007de1-344a-4a29-830c-6fc2cb8b8364-kube-api-access-zbgs4\") pod \"certified-operators-qwglh\" (UID: \"a8007de1-344a-4a29-830c-6fc2cb8b8364\") " pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.143788 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qwglh"] Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.143862 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8007de1-344a-4a29-830c-6fc2cb8b8364-utilities\") pod \"certified-operators-qwglh\" (UID: \"a8007de1-344a-4a29-830c-6fc2cb8b8364\") " pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.246854 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8007de1-344a-4a29-830c-6fc2cb8b8364-catalog-content\") pod \"certified-operators-qwglh\" (UID: \"a8007de1-344a-4a29-830c-6fc2cb8b8364\") " pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.246944 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbgs4\" (UniqueName: \"kubernetes.io/projected/a8007de1-344a-4a29-830c-6fc2cb8b8364-kube-api-access-zbgs4\") pod \"certified-operators-qwglh\" (UID: \"a8007de1-344a-4a29-830c-6fc2cb8b8364\") " pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.247468 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8007de1-344a-4a29-830c-6fc2cb8b8364-catalog-content\") pod \"certified-operators-qwglh\" (UID: \"a8007de1-344a-4a29-830c-6fc2cb8b8364\") " pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.247645 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8007de1-344a-4a29-830c-6fc2cb8b8364-utilities\") pod \"certified-operators-qwglh\" (UID: \"a8007de1-344a-4a29-830c-6fc2cb8b8364\") " pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.247949 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8007de1-344a-4a29-830c-6fc2cb8b8364-utilities\") pod \"certified-operators-qwglh\" (UID: \"a8007de1-344a-4a29-830c-6fc2cb8b8364\") " pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.274948 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbgs4\" (UniqueName: \"kubernetes.io/projected/a8007de1-344a-4a29-830c-6fc2cb8b8364-kube-api-access-zbgs4\") pod \"certified-operators-qwglh\" (UID: \"a8007de1-344a-4a29-830c-6fc2cb8b8364\") " pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.443968 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.707952 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cjdk8"] Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.710240 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.728055 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cjdk8"] Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.758337 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbnvd\" (UniqueName: \"kubernetes.io/projected/2075ea2e-fd03-498b-a3f0-eb5bd3091194-kube-api-access-cbnvd\") pod \"community-operators-cjdk8\" (UID: \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\") " pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.758411 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2075ea2e-fd03-498b-a3f0-eb5bd3091194-utilities\") pod \"community-operators-cjdk8\" (UID: \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\") " pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.759697 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2075ea2e-fd03-498b-a3f0-eb5bd3091194-catalog-content\") pod \"community-operators-cjdk8\" (UID: \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\") " pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.861676 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2075ea2e-fd03-498b-a3f0-eb5bd3091194-utilities\") pod \"community-operators-cjdk8\" (UID: \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\") " pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.861749 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2075ea2e-fd03-498b-a3f0-eb5bd3091194-catalog-content\") pod \"community-operators-cjdk8\" (UID: \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\") " pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.861867 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbnvd\" (UniqueName: \"kubernetes.io/projected/2075ea2e-fd03-498b-a3f0-eb5bd3091194-kube-api-access-cbnvd\") pod \"community-operators-cjdk8\" (UID: \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\") " pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.862297 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2075ea2e-fd03-498b-a3f0-eb5bd3091194-utilities\") pod \"community-operators-cjdk8\" (UID: \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\") " pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.862343 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2075ea2e-fd03-498b-a3f0-eb5bd3091194-catalog-content\") pod \"community-operators-cjdk8\" (UID: \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\") " pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:43 crc kubenswrapper[4755]: I1006 09:41:43.913044 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbnvd\" (UniqueName: \"kubernetes.io/projected/2075ea2e-fd03-498b-a3f0-eb5bd3091194-kube-api-access-cbnvd\") pod \"community-operators-cjdk8\" (UID: \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\") " pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:44 crc kubenswrapper[4755]: I1006 09:41:44.066155 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-qwglh"] Oct 06 09:41:44 crc kubenswrapper[4755]: I1006 09:41:44.083259 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:44 crc kubenswrapper[4755]: I1006 09:41:44.150936 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwglh" event={"ID":"a8007de1-344a-4a29-830c-6fc2cb8b8364","Type":"ContainerStarted","Data":"a13e2d6b84c797799094a36c0bd2b402e4aee84493f8b56780f633e9b10294eb"} Oct 06 09:41:44 crc kubenswrapper[4755]: I1006 09:41:44.690212 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cjdk8"] Oct 06 09:41:45 crc kubenswrapper[4755]: I1006 09:41:45.164500 4755 generic.go:334] "Generic (PLEG): container finished" podID="2075ea2e-fd03-498b-a3f0-eb5bd3091194" containerID="3ba4137b06d6546098046654b5caa309cb939da7ff34f2e6fd4f70f5d1bbc983" exitCode=0 Oct 06 09:41:45 crc kubenswrapper[4755]: I1006 09:41:45.164597 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjdk8" event={"ID":"2075ea2e-fd03-498b-a3f0-eb5bd3091194","Type":"ContainerDied","Data":"3ba4137b06d6546098046654b5caa309cb939da7ff34f2e6fd4f70f5d1bbc983"} Oct 06 09:41:45 crc kubenswrapper[4755]: I1006 09:41:45.166252 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjdk8" event={"ID":"2075ea2e-fd03-498b-a3f0-eb5bd3091194","Type":"ContainerStarted","Data":"8caf192352f540a5c66bac6617489071c035208ce113e53ac5b35f58e8007ebf"} Oct 06 09:41:45 crc kubenswrapper[4755]: I1006 09:41:45.168936 4755 generic.go:334] "Generic (PLEG): container finished" podID="a8007de1-344a-4a29-830c-6fc2cb8b8364" containerID="02e20ab2c2131aa5a76a4bb270e3e7f8cce9977e0dee0dc7642d83b08f231516" exitCode=0 Oct 06 09:41:45 crc kubenswrapper[4755]: I1006 09:41:45.168980 4755 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwglh" event={"ID":"a8007de1-344a-4a29-830c-6fc2cb8b8364","Type":"ContainerDied","Data":"02e20ab2c2131aa5a76a4bb270e3e7f8cce9977e0dee0dc7642d83b08f231516"} Oct 06 09:41:45 crc kubenswrapper[4755]: I1006 09:41:45.169544 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:41:47 crc kubenswrapper[4755]: I1006 09:41:47.185592 4755 generic.go:334] "Generic (PLEG): container finished" podID="2075ea2e-fd03-498b-a3f0-eb5bd3091194" containerID="64a3baabe47fe34c0f65760a9d2756dcab74ef6b3cce533873a8fdca46571c3a" exitCode=0 Oct 06 09:41:47 crc kubenswrapper[4755]: I1006 09:41:47.185744 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjdk8" event={"ID":"2075ea2e-fd03-498b-a3f0-eb5bd3091194","Type":"ContainerDied","Data":"64a3baabe47fe34c0f65760a9d2756dcab74ef6b3cce533873a8fdca46571c3a"} Oct 06 09:41:47 crc kubenswrapper[4755]: I1006 09:41:47.188800 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwglh" event={"ID":"a8007de1-344a-4a29-830c-6fc2cb8b8364","Type":"ContainerStarted","Data":"c84672a219bc2150e4337d40cba4efe9fd5cdc9d04758d0e297c7f9822d6cbcd"} Oct 06 09:41:48 crc kubenswrapper[4755]: I1006 09:41:48.201955 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjdk8" event={"ID":"2075ea2e-fd03-498b-a3f0-eb5bd3091194","Type":"ContainerStarted","Data":"4a024853684ed0d15e259b65f6c83ea0d746a99f7af466c8be8b2fdc6a0feb77"} Oct 06 09:41:48 crc kubenswrapper[4755]: I1006 09:41:48.206526 4755 generic.go:334] "Generic (PLEG): container finished" podID="a8007de1-344a-4a29-830c-6fc2cb8b8364" containerID="c84672a219bc2150e4337d40cba4efe9fd5cdc9d04758d0e297c7f9822d6cbcd" exitCode=0 Oct 06 09:41:48 crc kubenswrapper[4755]: I1006 09:41:48.206589 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-qwglh" event={"ID":"a8007de1-344a-4a29-830c-6fc2cb8b8364","Type":"ContainerDied","Data":"c84672a219bc2150e4337d40cba4efe9fd5cdc9d04758d0e297c7f9822d6cbcd"} Oct 06 09:41:48 crc kubenswrapper[4755]: I1006 09:41:48.234114 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cjdk8" podStartSLOduration=2.800102579 podStartE2EDuration="5.234097598s" podCreationTimestamp="2025-10-06 09:41:43 +0000 UTC" firstStartedPulling="2025-10-06 09:41:45.169300545 +0000 UTC m=+4761.998615759" lastFinishedPulling="2025-10-06 09:41:47.603295564 +0000 UTC m=+4764.432610778" observedRunningTime="2025-10-06 09:41:48.228036529 +0000 UTC m=+4765.057351753" watchObservedRunningTime="2025-10-06 09:41:48.234097598 +0000 UTC m=+4765.063412812" Oct 06 09:41:48 crc kubenswrapper[4755]: I1006 09:41:48.879091 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:41:48 crc kubenswrapper[4755]: E1006 09:41:48.879813 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:41:49 crc kubenswrapper[4755]: I1006 09:41:49.225611 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwglh" event={"ID":"a8007de1-344a-4a29-830c-6fc2cb8b8364","Type":"ContainerStarted","Data":"e50f0989f752e5e961d0bd8237765a205332e0dc5e1e10a1bf5e66b5c709e70b"} Oct 06 09:41:49 crc kubenswrapper[4755]: I1006 09:41:49.252661 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-qwglh" podStartSLOduration=2.800588409 podStartE2EDuration="6.252643219s" podCreationTimestamp="2025-10-06 09:41:43 +0000 UTC" firstStartedPulling="2025-10-06 09:41:45.171101589 +0000 UTC m=+4762.000416803" lastFinishedPulling="2025-10-06 09:41:48.623156389 +0000 UTC m=+4765.452471613" observedRunningTime="2025-10-06 09:41:49.24862232 +0000 UTC m=+4766.077937554" watchObservedRunningTime="2025-10-06 09:41:49.252643219 +0000 UTC m=+4766.081958433" Oct 06 09:41:53 crc kubenswrapper[4755]: I1006 09:41:53.444515 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:53 crc kubenswrapper[4755]: I1006 09:41:53.445123 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:53 crc kubenswrapper[4755]: I1006 09:41:53.501256 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:54 crc kubenswrapper[4755]: I1006 09:41:54.083784 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:54 crc kubenswrapper[4755]: I1006 09:41:54.084369 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:54 crc kubenswrapper[4755]: I1006 09:41:54.139054 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:54 crc kubenswrapper[4755]: I1006 09:41:54.350598 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:54 crc kubenswrapper[4755]: I1006 09:41:54.352885 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:54 crc kubenswrapper[4755]: I1006 09:41:54.893396 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cjdk8"] Oct 06 09:41:56 crc kubenswrapper[4755]: I1006 09:41:56.309349 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cjdk8" podUID="2075ea2e-fd03-498b-a3f0-eb5bd3091194" containerName="registry-server" containerID="cri-o://4a024853684ed0d15e259b65f6c83ea0d746a99f7af466c8be8b2fdc6a0feb77" gracePeriod=2 Oct 06 09:41:56 crc kubenswrapper[4755]: I1006 09:41:56.686496 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qwglh"] Oct 06 09:41:56 crc kubenswrapper[4755]: I1006 09:41:56.687072 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qwglh" podUID="a8007de1-344a-4a29-830c-6fc2cb8b8364" containerName="registry-server" containerID="cri-o://e50f0989f752e5e961d0bd8237765a205332e0dc5e1e10a1bf5e66b5c709e70b" gracePeriod=2 Oct 06 09:41:56 crc kubenswrapper[4755]: I1006 09:41:56.880606 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:56 crc kubenswrapper[4755]: I1006 09:41:56.976832 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2075ea2e-fd03-498b-a3f0-eb5bd3091194-catalog-content\") pod \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\" (UID: \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\") " Oct 06 09:41:56 crc kubenswrapper[4755]: I1006 09:41:56.977083 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbnvd\" (UniqueName: \"kubernetes.io/projected/2075ea2e-fd03-498b-a3f0-eb5bd3091194-kube-api-access-cbnvd\") pod \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\" (UID: \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\") " Oct 06 09:41:56 crc kubenswrapper[4755]: I1006 09:41:56.977331 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2075ea2e-fd03-498b-a3f0-eb5bd3091194-utilities\") pod \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\" (UID: \"2075ea2e-fd03-498b-a3f0-eb5bd3091194\") " Oct 06 09:41:56 crc kubenswrapper[4755]: I1006 09:41:56.980408 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2075ea2e-fd03-498b-a3f0-eb5bd3091194-utilities" (OuterVolumeSpecName: "utilities") pod "2075ea2e-fd03-498b-a3f0-eb5bd3091194" (UID: "2075ea2e-fd03-498b-a3f0-eb5bd3091194"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.003949 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2075ea2e-fd03-498b-a3f0-eb5bd3091194-kube-api-access-cbnvd" (OuterVolumeSpecName: "kube-api-access-cbnvd") pod "2075ea2e-fd03-498b-a3f0-eb5bd3091194" (UID: "2075ea2e-fd03-498b-a3f0-eb5bd3091194"). InnerVolumeSpecName "kube-api-access-cbnvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.040369 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2075ea2e-fd03-498b-a3f0-eb5bd3091194-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2075ea2e-fd03-498b-a3f0-eb5bd3091194" (UID: "2075ea2e-fd03-498b-a3f0-eb5bd3091194"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.079860 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbnvd\" (UniqueName: \"kubernetes.io/projected/2075ea2e-fd03-498b-a3f0-eb5bd3091194-kube-api-access-cbnvd\") on node \"crc\" DevicePath \"\"" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.079904 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2075ea2e-fd03-498b-a3f0-eb5bd3091194-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.079919 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2075ea2e-fd03-498b-a3f0-eb5bd3091194-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.094832 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.181077 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8007de1-344a-4a29-830c-6fc2cb8b8364-catalog-content\") pod \"a8007de1-344a-4a29-830c-6fc2cb8b8364\" (UID: \"a8007de1-344a-4a29-830c-6fc2cb8b8364\") " Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.181793 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbgs4\" (UniqueName: \"kubernetes.io/projected/a8007de1-344a-4a29-830c-6fc2cb8b8364-kube-api-access-zbgs4\") pod \"a8007de1-344a-4a29-830c-6fc2cb8b8364\" (UID: \"a8007de1-344a-4a29-830c-6fc2cb8b8364\") " Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.181849 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8007de1-344a-4a29-830c-6fc2cb8b8364-utilities\") pod \"a8007de1-344a-4a29-830c-6fc2cb8b8364\" (UID: \"a8007de1-344a-4a29-830c-6fc2cb8b8364\") " Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.182976 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8007de1-344a-4a29-830c-6fc2cb8b8364-utilities" (OuterVolumeSpecName: "utilities") pod "a8007de1-344a-4a29-830c-6fc2cb8b8364" (UID: "a8007de1-344a-4a29-830c-6fc2cb8b8364"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.185069 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8007de1-344a-4a29-830c-6fc2cb8b8364-kube-api-access-zbgs4" (OuterVolumeSpecName: "kube-api-access-zbgs4") pod "a8007de1-344a-4a29-830c-6fc2cb8b8364" (UID: "a8007de1-344a-4a29-830c-6fc2cb8b8364"). InnerVolumeSpecName "kube-api-access-zbgs4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.233929 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8007de1-344a-4a29-830c-6fc2cb8b8364-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8007de1-344a-4a29-830c-6fc2cb8b8364" (UID: "a8007de1-344a-4a29-830c-6fc2cb8b8364"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.284602 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbgs4\" (UniqueName: \"kubernetes.io/projected/a8007de1-344a-4a29-830c-6fc2cb8b8364-kube-api-access-zbgs4\") on node \"crc\" DevicePath \"\"" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.284640 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8007de1-344a-4a29-830c-6fc2cb8b8364-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.284663 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8007de1-344a-4a29-830c-6fc2cb8b8364-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.350906 4755 generic.go:334] "Generic (PLEG): container finished" podID="2075ea2e-fd03-498b-a3f0-eb5bd3091194" containerID="4a024853684ed0d15e259b65f6c83ea0d746a99f7af466c8be8b2fdc6a0feb77" exitCode=0 Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.351046 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjdk8" event={"ID":"2075ea2e-fd03-498b-a3f0-eb5bd3091194","Type":"ContainerDied","Data":"4a024853684ed0d15e259b65f6c83ea0d746a99f7af466c8be8b2fdc6a0feb77"} Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.351175 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-cjdk8" event={"ID":"2075ea2e-fd03-498b-a3f0-eb5bd3091194","Type":"ContainerDied","Data":"8caf192352f540a5c66bac6617489071c035208ce113e53ac5b35f58e8007ebf"} Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.351236 4755 scope.go:117] "RemoveContainer" containerID="4a024853684ed0d15e259b65f6c83ea0d746a99f7af466c8be8b2fdc6a0feb77" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.351323 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cjdk8" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.361756 4755 generic.go:334] "Generic (PLEG): container finished" podID="a8007de1-344a-4a29-830c-6fc2cb8b8364" containerID="e50f0989f752e5e961d0bd8237765a205332e0dc5e1e10a1bf5e66b5c709e70b" exitCode=0 Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.361815 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwglh" event={"ID":"a8007de1-344a-4a29-830c-6fc2cb8b8364","Type":"ContainerDied","Data":"e50f0989f752e5e961d0bd8237765a205332e0dc5e1e10a1bf5e66b5c709e70b"} Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.361882 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwglh" event={"ID":"a8007de1-344a-4a29-830c-6fc2cb8b8364","Type":"ContainerDied","Data":"a13e2d6b84c797799094a36c0bd2b402e4aee84493f8b56780f633e9b10294eb"} Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.361993 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qwglh" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.387496 4755 scope.go:117] "RemoveContainer" containerID="64a3baabe47fe34c0f65760a9d2756dcab74ef6b3cce533873a8fdca46571c3a" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.398262 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cjdk8"] Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.409190 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cjdk8"] Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.417789 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qwglh"] Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.421761 4755 scope.go:117] "RemoveContainer" containerID="3ba4137b06d6546098046654b5caa309cb939da7ff34f2e6fd4f70f5d1bbc983" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.447654 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qwglh"] Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.474099 4755 scope.go:117] "RemoveContainer" containerID="4a024853684ed0d15e259b65f6c83ea0d746a99f7af466c8be8b2fdc6a0feb77" Oct 06 09:41:57 crc kubenswrapper[4755]: E1006 09:41:57.474494 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a024853684ed0d15e259b65f6c83ea0d746a99f7af466c8be8b2fdc6a0feb77\": container with ID starting with 4a024853684ed0d15e259b65f6c83ea0d746a99f7af466c8be8b2fdc6a0feb77 not found: ID does not exist" containerID="4a024853684ed0d15e259b65f6c83ea0d746a99f7af466c8be8b2fdc6a0feb77" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.474536 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4a024853684ed0d15e259b65f6c83ea0d746a99f7af466c8be8b2fdc6a0feb77"} err="failed to get container status \"4a024853684ed0d15e259b65f6c83ea0d746a99f7af466c8be8b2fdc6a0feb77\": rpc error: code = NotFound desc = could not find container \"4a024853684ed0d15e259b65f6c83ea0d746a99f7af466c8be8b2fdc6a0feb77\": container with ID starting with 4a024853684ed0d15e259b65f6c83ea0d746a99f7af466c8be8b2fdc6a0feb77 not found: ID does not exist" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.474606 4755 scope.go:117] "RemoveContainer" containerID="64a3baabe47fe34c0f65760a9d2756dcab74ef6b3cce533873a8fdca46571c3a" Oct 06 09:41:57 crc kubenswrapper[4755]: E1006 09:41:57.474950 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a3baabe47fe34c0f65760a9d2756dcab74ef6b3cce533873a8fdca46571c3a\": container with ID starting with 64a3baabe47fe34c0f65760a9d2756dcab74ef6b3cce533873a8fdca46571c3a not found: ID does not exist" containerID="64a3baabe47fe34c0f65760a9d2756dcab74ef6b3cce533873a8fdca46571c3a" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.474979 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a3baabe47fe34c0f65760a9d2756dcab74ef6b3cce533873a8fdca46571c3a"} err="failed to get container status \"64a3baabe47fe34c0f65760a9d2756dcab74ef6b3cce533873a8fdca46571c3a\": rpc error: code = NotFound desc = could not find container \"64a3baabe47fe34c0f65760a9d2756dcab74ef6b3cce533873a8fdca46571c3a\": container with ID starting with 64a3baabe47fe34c0f65760a9d2756dcab74ef6b3cce533873a8fdca46571c3a not found: ID does not exist" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.475015 4755 scope.go:117] "RemoveContainer" containerID="3ba4137b06d6546098046654b5caa309cb939da7ff34f2e6fd4f70f5d1bbc983" Oct 06 09:41:57 crc kubenswrapper[4755]: E1006 09:41:57.475291 4755 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3ba4137b06d6546098046654b5caa309cb939da7ff34f2e6fd4f70f5d1bbc983\": container with ID starting with 3ba4137b06d6546098046654b5caa309cb939da7ff34f2e6fd4f70f5d1bbc983 not found: ID does not exist" containerID="3ba4137b06d6546098046654b5caa309cb939da7ff34f2e6fd4f70f5d1bbc983" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.475315 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba4137b06d6546098046654b5caa309cb939da7ff34f2e6fd4f70f5d1bbc983"} err="failed to get container status \"3ba4137b06d6546098046654b5caa309cb939da7ff34f2e6fd4f70f5d1bbc983\": rpc error: code = NotFound desc = could not find container \"3ba4137b06d6546098046654b5caa309cb939da7ff34f2e6fd4f70f5d1bbc983\": container with ID starting with 3ba4137b06d6546098046654b5caa309cb939da7ff34f2e6fd4f70f5d1bbc983 not found: ID does not exist" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.475329 4755 scope.go:117] "RemoveContainer" containerID="e50f0989f752e5e961d0bd8237765a205332e0dc5e1e10a1bf5e66b5c709e70b" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.527535 4755 scope.go:117] "RemoveContainer" containerID="c84672a219bc2150e4337d40cba4efe9fd5cdc9d04758d0e297c7f9822d6cbcd" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.554222 4755 scope.go:117] "RemoveContainer" containerID="02e20ab2c2131aa5a76a4bb270e3e7f8cce9977e0dee0dc7642d83b08f231516" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.590299 4755 scope.go:117] "RemoveContainer" containerID="e50f0989f752e5e961d0bd8237765a205332e0dc5e1e10a1bf5e66b5c709e70b" Oct 06 09:41:57 crc kubenswrapper[4755]: E1006 09:41:57.591993 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e50f0989f752e5e961d0bd8237765a205332e0dc5e1e10a1bf5e66b5c709e70b\": container with ID starting with 
e50f0989f752e5e961d0bd8237765a205332e0dc5e1e10a1bf5e66b5c709e70b not found: ID does not exist" containerID="e50f0989f752e5e961d0bd8237765a205332e0dc5e1e10a1bf5e66b5c709e70b" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.592030 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e50f0989f752e5e961d0bd8237765a205332e0dc5e1e10a1bf5e66b5c709e70b"} err="failed to get container status \"e50f0989f752e5e961d0bd8237765a205332e0dc5e1e10a1bf5e66b5c709e70b\": rpc error: code = NotFound desc = could not find container \"e50f0989f752e5e961d0bd8237765a205332e0dc5e1e10a1bf5e66b5c709e70b\": container with ID starting with e50f0989f752e5e961d0bd8237765a205332e0dc5e1e10a1bf5e66b5c709e70b not found: ID does not exist" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.592061 4755 scope.go:117] "RemoveContainer" containerID="c84672a219bc2150e4337d40cba4efe9fd5cdc9d04758d0e297c7f9822d6cbcd" Oct 06 09:41:57 crc kubenswrapper[4755]: E1006 09:41:57.592550 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c84672a219bc2150e4337d40cba4efe9fd5cdc9d04758d0e297c7f9822d6cbcd\": container with ID starting with c84672a219bc2150e4337d40cba4efe9fd5cdc9d04758d0e297c7f9822d6cbcd not found: ID does not exist" containerID="c84672a219bc2150e4337d40cba4efe9fd5cdc9d04758d0e297c7f9822d6cbcd" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.592674 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84672a219bc2150e4337d40cba4efe9fd5cdc9d04758d0e297c7f9822d6cbcd"} err="failed to get container status \"c84672a219bc2150e4337d40cba4efe9fd5cdc9d04758d0e297c7f9822d6cbcd\": rpc error: code = NotFound desc = could not find container \"c84672a219bc2150e4337d40cba4efe9fd5cdc9d04758d0e297c7f9822d6cbcd\": container with ID starting with c84672a219bc2150e4337d40cba4efe9fd5cdc9d04758d0e297c7f9822d6cbcd not found: ID does not 
exist" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.592692 4755 scope.go:117] "RemoveContainer" containerID="02e20ab2c2131aa5a76a4bb270e3e7f8cce9977e0dee0dc7642d83b08f231516" Oct 06 09:41:57 crc kubenswrapper[4755]: E1006 09:41:57.592937 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02e20ab2c2131aa5a76a4bb270e3e7f8cce9977e0dee0dc7642d83b08f231516\": container with ID starting with 02e20ab2c2131aa5a76a4bb270e3e7f8cce9977e0dee0dc7642d83b08f231516 not found: ID does not exist" containerID="02e20ab2c2131aa5a76a4bb270e3e7f8cce9977e0dee0dc7642d83b08f231516" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.592959 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e20ab2c2131aa5a76a4bb270e3e7f8cce9977e0dee0dc7642d83b08f231516"} err="failed to get container status \"02e20ab2c2131aa5a76a4bb270e3e7f8cce9977e0dee0dc7642d83b08f231516\": rpc error: code = NotFound desc = could not find container \"02e20ab2c2131aa5a76a4bb270e3e7f8cce9977e0dee0dc7642d83b08f231516\": container with ID starting with 02e20ab2c2131aa5a76a4bb270e3e7f8cce9977e0dee0dc7642d83b08f231516 not found: ID does not exist" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.892873 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2075ea2e-fd03-498b-a3f0-eb5bd3091194" path="/var/lib/kubelet/pods/2075ea2e-fd03-498b-a3f0-eb5bd3091194/volumes" Oct 06 09:41:57 crc kubenswrapper[4755]: I1006 09:41:57.894478 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8007de1-344a-4a29-830c-6fc2cb8b8364" path="/var/lib/kubelet/pods/a8007de1-344a-4a29-830c-6fc2cb8b8364/volumes" Oct 06 09:41:59 crc kubenswrapper[4755]: I1006 09:41:59.880338 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:41:59 crc kubenswrapper[4755]: E1006 09:41:59.881096 4755 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:42:10 crc kubenswrapper[4755]: I1006 09:42:10.878761 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:42:10 crc kubenswrapper[4755]: E1006 09:42:10.879550 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:42:23 crc kubenswrapper[4755]: I1006 09:42:23.879405 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:42:23 crc kubenswrapper[4755]: E1006 09:42:23.881139 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:42:36 crc kubenswrapper[4755]: I1006 09:42:36.880046 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:42:36 crc kubenswrapper[4755]: E1006 
09:42:36.881194 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:42:47 crc kubenswrapper[4755]: I1006 09:42:47.879157 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:42:47 crc kubenswrapper[4755]: E1006 09:42:47.879981 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:42:59 crc kubenswrapper[4755]: I1006 09:42:59.879492 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:42:59 crc kubenswrapper[4755]: E1006 09:42:59.880501 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:43:11 crc kubenswrapper[4755]: I1006 09:43:11.880057 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:43:11 crc 
kubenswrapper[4755]: E1006 09:43:11.882698 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:43:25 crc kubenswrapper[4755]: I1006 09:43:25.880178 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:43:26 crc kubenswrapper[4755]: I1006 09:43:26.269553 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"23a9d9458ee8bfc68906867f6ba2d0cc07d6b8cdc0736c7276d2a9fe7b88f3f9"} Oct 06 09:43:54 crc kubenswrapper[4755]: I1006 09:43:54.601740 4755 generic.go:334] "Generic (PLEG): container finished" podID="d5b996de-cff3-4a46-bfd0-25e7833f58e8" containerID="de8fd9e6a3d31077d285d1c560ab461055c4d19edb2a5a64dee6f19ce074861a" exitCode=1 Oct 06 09:43:54 crc kubenswrapper[4755]: I1006 09:43:54.601983 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d5b996de-cff3-4a46-bfd0-25e7833f58e8","Type":"ContainerDied","Data":"de8fd9e6a3d31077d285d1c560ab461055c4d19edb2a5a64dee6f19ce074861a"} Oct 06 09:43:55 crc kubenswrapper[4755]: I1006 09:43:55.969774 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.068704 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-ca-certs\") pod \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.069087 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d5b996de-cff3-4a46-bfd0-25e7833f58e8-test-operator-ephemeral-workdir\") pod \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.069776 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-ssh-key\") pod \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.069940 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-openstack-config-secret\") pod \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.070072 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5b996de-cff3-4a46-bfd0-25e7833f58e8-config-data\") pod \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.070117 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d5b996de-cff3-4a46-bfd0-25e7833f58e8-openstack-config\") pod \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.070204 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jqmx\" (UniqueName: \"kubernetes.io/projected/d5b996de-cff3-4a46-bfd0-25e7833f58e8-kube-api-access-4jqmx\") pod \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.070336 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d5b996de-cff3-4a46-bfd0-25e7833f58e8-test-operator-ephemeral-temporary\") pod \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.070408 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\" (UID: \"d5b996de-cff3-4a46-bfd0-25e7833f58e8\") " Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.072809 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5b996de-cff3-4a46-bfd0-25e7833f58e8-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "d5b996de-cff3-4a46-bfd0-25e7833f58e8" (UID: "d5b996de-cff3-4a46-bfd0-25e7833f58e8"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.073254 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b996de-cff3-4a46-bfd0-25e7833f58e8-config-data" (OuterVolumeSpecName: "config-data") pod "d5b996de-cff3-4a46-bfd0-25e7833f58e8" (UID: "d5b996de-cff3-4a46-bfd0-25e7833f58e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.075462 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5b996de-cff3-4a46-bfd0-25e7833f58e8-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "d5b996de-cff3-4a46-bfd0-25e7833f58e8" (UID: "d5b996de-cff3-4a46-bfd0-25e7833f58e8"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.077780 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "d5b996de-cff3-4a46-bfd0-25e7833f58e8" (UID: "d5b996de-cff3-4a46-bfd0-25e7833f58e8"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.086600 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b996de-cff3-4a46-bfd0-25e7833f58e8-kube-api-access-4jqmx" (OuterVolumeSpecName: "kube-api-access-4jqmx") pod "d5b996de-cff3-4a46-bfd0-25e7833f58e8" (UID: "d5b996de-cff3-4a46-bfd0-25e7833f58e8"). InnerVolumeSpecName "kube-api-access-4jqmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.113313 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "d5b996de-cff3-4a46-bfd0-25e7833f58e8" (UID: "d5b996de-cff3-4a46-bfd0-25e7833f58e8"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.121540 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d5b996de-cff3-4a46-bfd0-25e7833f58e8" (UID: "d5b996de-cff3-4a46-bfd0-25e7833f58e8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.124937 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d5b996de-cff3-4a46-bfd0-25e7833f58e8" (UID: "d5b996de-cff3-4a46-bfd0-25e7833f58e8"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.125991 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b996de-cff3-4a46-bfd0-25e7833f58e8-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d5b996de-cff3-4a46-bfd0-25e7833f58e8" (UID: "d5b996de-cff3-4a46-bfd0-25e7833f58e8"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.173072 4755 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d5b996de-cff3-4a46-bfd0-25e7833f58e8-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.173137 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.173152 4755 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.173161 4755 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d5b996de-cff3-4a46-bfd0-25e7833f58e8-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.173171 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.173181 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d5b996de-cff3-4a46-bfd0-25e7833f58e8-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.173189 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5b996de-cff3-4a46-bfd0-25e7833f58e8-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:43:56 crc 
kubenswrapper[4755]: I1006 09:43:56.173197 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d5b996de-cff3-4a46-bfd0-25e7833f58e8-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.173206 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jqmx\" (UniqueName: \"kubernetes.io/projected/d5b996de-cff3-4a46-bfd0-25e7833f58e8-kube-api-access-4jqmx\") on node \"crc\" DevicePath \"\"" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.192703 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.275200 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.624972 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d5b996de-cff3-4a46-bfd0-25e7833f58e8","Type":"ContainerDied","Data":"a348eca4ce0eee37520be800016196aac2e5be8a2b911dab3b9e3479bd657a7d"} Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.625014 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a348eca4ce0eee37520be800016196aac2e5be8a2b911dab3b9e3479bd657a7d" Oct 06 09:43:56 crc kubenswrapper[4755]: I1006 09:43:56.625074 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.585128 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 06 09:44:00 crc kubenswrapper[4755]: E1006 09:44:00.586278 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2075ea2e-fd03-498b-a3f0-eb5bd3091194" containerName="extract-content" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.586298 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2075ea2e-fd03-498b-a3f0-eb5bd3091194" containerName="extract-content" Oct 06 09:44:00 crc kubenswrapper[4755]: E1006 09:44:00.586315 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8007de1-344a-4a29-830c-6fc2cb8b8364" containerName="registry-server" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.586323 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8007de1-344a-4a29-830c-6fc2cb8b8364" containerName="registry-server" Oct 06 09:44:00 crc kubenswrapper[4755]: E1006 09:44:00.586346 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8007de1-344a-4a29-830c-6fc2cb8b8364" containerName="extract-content" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.586354 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8007de1-344a-4a29-830c-6fc2cb8b8364" containerName="extract-content" Oct 06 09:44:00 crc kubenswrapper[4755]: E1006 09:44:00.586372 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8007de1-344a-4a29-830c-6fc2cb8b8364" containerName="extract-utilities" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.586379 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8007de1-344a-4a29-830c-6fc2cb8b8364" containerName="extract-utilities" Oct 06 09:44:00 crc kubenswrapper[4755]: E1006 09:44:00.586403 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2075ea2e-fd03-498b-a3f0-eb5bd3091194" containerName="extract-utilities" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.586411 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2075ea2e-fd03-498b-a3f0-eb5bd3091194" containerName="extract-utilities" Oct 06 09:44:00 crc kubenswrapper[4755]: E1006 09:44:00.586433 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2075ea2e-fd03-498b-a3f0-eb5bd3091194" containerName="registry-server" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.586442 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2075ea2e-fd03-498b-a3f0-eb5bd3091194" containerName="registry-server" Oct 06 09:44:00 crc kubenswrapper[4755]: E1006 09:44:00.586463 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b996de-cff3-4a46-bfd0-25e7833f58e8" containerName="tempest-tests-tempest-tests-runner" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.586473 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b996de-cff3-4a46-bfd0-25e7833f58e8" containerName="tempest-tests-tempest-tests-runner" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.586747 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2075ea2e-fd03-498b-a3f0-eb5bd3091194" containerName="registry-server" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.586771 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b996de-cff3-4a46-bfd0-25e7833f58e8" containerName="tempest-tests-tempest-tests-runner" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.586782 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8007de1-344a-4a29-830c-6fc2cb8b8364" containerName="registry-server" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.587812 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.589882 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-twssm" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.595590 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.707880 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xst24\" (UniqueName: \"kubernetes.io/projected/391f42bd-301b-4593-bc2d-3955a5114d27-kube-api-access-xst24\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"391f42bd-301b-4593-bc2d-3955a5114d27\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.708038 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"391f42bd-301b-4593-bc2d-3955a5114d27\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.809837 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xst24\" (UniqueName: \"kubernetes.io/projected/391f42bd-301b-4593-bc2d-3955a5114d27-kube-api-access-xst24\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"391f42bd-301b-4593-bc2d-3955a5114d27\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.809914 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"391f42bd-301b-4593-bc2d-3955a5114d27\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.810385 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"391f42bd-301b-4593-bc2d-3955a5114d27\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.841834 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xst24\" (UniqueName: \"kubernetes.io/projected/391f42bd-301b-4593-bc2d-3955a5114d27-kube-api-access-xst24\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"391f42bd-301b-4593-bc2d-3955a5114d27\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.853288 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"391f42bd-301b-4593-bc2d-3955a5114d27\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 09:44:00 crc kubenswrapper[4755]: I1006 09:44:00.927205 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 09:44:01 crc kubenswrapper[4755]: I1006 09:44:01.382743 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 06 09:44:02 crc kubenswrapper[4755]: I1006 09:44:02.686167 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"391f42bd-301b-4593-bc2d-3955a5114d27","Type":"ContainerStarted","Data":"7f37f490236d147b5051d60dfd5f50dea12abac958fd60d6292bee936d96737f"} Oct 06 09:44:03 crc kubenswrapper[4755]: I1006 09:44:03.697934 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"391f42bd-301b-4593-bc2d-3955a5114d27","Type":"ContainerStarted","Data":"116a0bde9bd122235303a87b787c6316e5803fbcec7c3296b70f7c20246b3dc8"} Oct 06 09:44:03 crc kubenswrapper[4755]: I1006 09:44:03.728376 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.71621329 podStartE2EDuration="3.728351585s" podCreationTimestamp="2025-10-06 09:44:00 +0000 UTC" firstStartedPulling="2025-10-06 09:44:01.991206132 +0000 UTC m=+4898.820521376" lastFinishedPulling="2025-10-06 09:44:03.003344447 +0000 UTC m=+4899.832659671" observedRunningTime="2025-10-06 09:44:03.718074902 +0000 UTC m=+4900.547390186" watchObservedRunningTime="2025-10-06 09:44:03.728351585 +0000 UTC m=+4900.557666819" Oct 06 09:44:45 crc kubenswrapper[4755]: I1006 09:44:45.176875 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-srw69/must-gather-rqdkn"] Oct 06 09:44:45 crc kubenswrapper[4755]: I1006 09:44:45.179137 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-srw69/must-gather-rqdkn" Oct 06 09:44:45 crc kubenswrapper[4755]: I1006 09:44:45.180549 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-srw69"/"openshift-service-ca.crt" Oct 06 09:44:45 crc kubenswrapper[4755]: I1006 09:44:45.180723 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-srw69"/"default-dockercfg-dxgz5" Oct 06 09:44:45 crc kubenswrapper[4755]: I1006 09:44:45.181020 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-srw69"/"kube-root-ca.crt" Oct 06 09:44:45 crc kubenswrapper[4755]: I1006 09:44:45.191614 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-srw69/must-gather-rqdkn"] Oct 06 09:44:45 crc kubenswrapper[4755]: I1006 09:44:45.338942 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8410ff38-76bd-40d9-99e3-7a6e1d85c220-must-gather-output\") pod \"must-gather-rqdkn\" (UID: \"8410ff38-76bd-40d9-99e3-7a6e1d85c220\") " pod="openshift-must-gather-srw69/must-gather-rqdkn" Oct 06 09:44:45 crc kubenswrapper[4755]: I1006 09:44:45.339151 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdsrc\" (UniqueName: \"kubernetes.io/projected/8410ff38-76bd-40d9-99e3-7a6e1d85c220-kube-api-access-bdsrc\") pod \"must-gather-rqdkn\" (UID: \"8410ff38-76bd-40d9-99e3-7a6e1d85c220\") " pod="openshift-must-gather-srw69/must-gather-rqdkn" Oct 06 09:44:45 crc kubenswrapper[4755]: I1006 09:44:45.440280 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdsrc\" (UniqueName: \"kubernetes.io/projected/8410ff38-76bd-40d9-99e3-7a6e1d85c220-kube-api-access-bdsrc\") pod \"must-gather-rqdkn\" (UID: \"8410ff38-76bd-40d9-99e3-7a6e1d85c220\") " 
pod="openshift-must-gather-srw69/must-gather-rqdkn" Oct 06 09:44:45 crc kubenswrapper[4755]: I1006 09:44:45.440370 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8410ff38-76bd-40d9-99e3-7a6e1d85c220-must-gather-output\") pod \"must-gather-rqdkn\" (UID: \"8410ff38-76bd-40d9-99e3-7a6e1d85c220\") " pod="openshift-must-gather-srw69/must-gather-rqdkn" Oct 06 09:44:45 crc kubenswrapper[4755]: I1006 09:44:45.440801 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8410ff38-76bd-40d9-99e3-7a6e1d85c220-must-gather-output\") pod \"must-gather-rqdkn\" (UID: \"8410ff38-76bd-40d9-99e3-7a6e1d85c220\") " pod="openshift-must-gather-srw69/must-gather-rqdkn" Oct 06 09:44:45 crc kubenswrapper[4755]: I1006 09:44:45.460444 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdsrc\" (UniqueName: \"kubernetes.io/projected/8410ff38-76bd-40d9-99e3-7a6e1d85c220-kube-api-access-bdsrc\") pod \"must-gather-rqdkn\" (UID: \"8410ff38-76bd-40d9-99e3-7a6e1d85c220\") " pod="openshift-must-gather-srw69/must-gather-rqdkn" Oct 06 09:44:45 crc kubenswrapper[4755]: I1006 09:44:45.496258 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-srw69/must-gather-rqdkn" Oct 06 09:44:46 crc kubenswrapper[4755]: I1006 09:44:46.024249 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-srw69/must-gather-rqdkn"] Oct 06 09:44:46 crc kubenswrapper[4755]: I1006 09:44:46.168327 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-srw69/must-gather-rqdkn" event={"ID":"8410ff38-76bd-40d9-99e3-7a6e1d85c220","Type":"ContainerStarted","Data":"4609603a3431cf07caaa219f657ef8574192dcb69f61c530857da4d008ef0fa6"} Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.048615 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-98pgf"] Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.051618 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.081802 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98pgf"] Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.087376 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4866ca28-1974-4a53-b993-5481258448b5-catalog-content\") pod \"redhat-operators-98pgf\" (UID: \"4866ca28-1974-4a53-b993-5481258448b5\") " pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.087422 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4866ca28-1974-4a53-b993-5481258448b5-utilities\") pod \"redhat-operators-98pgf\" (UID: \"4866ca28-1974-4a53-b993-5481258448b5\") " pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.087462 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kgkz\" (UniqueName: \"kubernetes.io/projected/4866ca28-1974-4a53-b993-5481258448b5-kube-api-access-4kgkz\") pod \"redhat-operators-98pgf\" (UID: \"4866ca28-1974-4a53-b993-5481258448b5\") " pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.189491 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4866ca28-1974-4a53-b993-5481258448b5-catalog-content\") pod \"redhat-operators-98pgf\" (UID: \"4866ca28-1974-4a53-b993-5481258448b5\") " pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.189590 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4866ca28-1974-4a53-b993-5481258448b5-utilities\") pod \"redhat-operators-98pgf\" (UID: \"4866ca28-1974-4a53-b993-5481258448b5\") " pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.189642 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kgkz\" (UniqueName: \"kubernetes.io/projected/4866ca28-1974-4a53-b993-5481258448b5-kube-api-access-4kgkz\") pod \"redhat-operators-98pgf\" (UID: \"4866ca28-1974-4a53-b993-5481258448b5\") " pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.190406 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4866ca28-1974-4a53-b993-5481258448b5-catalog-content\") pod \"redhat-operators-98pgf\" (UID: \"4866ca28-1974-4a53-b993-5481258448b5\") " pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.190452 4755 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4866ca28-1974-4a53-b993-5481258448b5-utilities\") pod \"redhat-operators-98pgf\" (UID: \"4866ca28-1974-4a53-b993-5481258448b5\") " pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.216554 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kgkz\" (UniqueName: \"kubernetes.io/projected/4866ca28-1974-4a53-b993-5481258448b5-kube-api-access-4kgkz\") pod \"redhat-operators-98pgf\" (UID: \"4866ca28-1974-4a53-b993-5481258448b5\") " pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.236464 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-srw69/must-gather-rqdkn" event={"ID":"8410ff38-76bd-40d9-99e3-7a6e1d85c220","Type":"ContainerStarted","Data":"3a8570884caf7afd9353a54d28a3a956853d986fcaa699c3a2840d8e71a9c322"} Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.236518 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-srw69/must-gather-rqdkn" event={"ID":"8410ff38-76bd-40d9-99e3-7a6e1d85c220","Type":"ContainerStarted","Data":"d3f1ee475eef78ddc12712e7ce8dfb2dce36432fba50882d5cde8639ff2cb04f"} Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.268634 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-srw69/must-gather-rqdkn" podStartSLOduration=1.7900296390000001 podStartE2EDuration="7.268615313s" podCreationTimestamp="2025-10-06 09:44:45 +0000 UTC" firstStartedPulling="2025-10-06 09:44:46.032404939 +0000 UTC m=+4942.861720153" lastFinishedPulling="2025-10-06 09:44:51.510990613 +0000 UTC m=+4948.340305827" observedRunningTime="2025-10-06 09:44:52.265893266 +0000 UTC m=+4949.095208480" watchObservedRunningTime="2025-10-06 09:44:52.268615313 +0000 UTC m=+4949.097930527" Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.371376 
4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:44:52 crc kubenswrapper[4755]: I1006 09:44:52.923629 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98pgf"] Oct 06 09:44:54 crc kubenswrapper[4755]: I1006 09:44:54.264388 4755 generic.go:334] "Generic (PLEG): container finished" podID="4866ca28-1974-4a53-b993-5481258448b5" containerID="fb8dcbe8686b137b905ce2c9c400c2c98307dafcad754f8087778ed5734f6afc" exitCode=0 Oct 06 09:44:54 crc kubenswrapper[4755]: I1006 09:44:54.264890 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98pgf" event={"ID":"4866ca28-1974-4a53-b993-5481258448b5","Type":"ContainerDied","Data":"fb8dcbe8686b137b905ce2c9c400c2c98307dafcad754f8087778ed5734f6afc"} Oct 06 09:44:54 crc kubenswrapper[4755]: I1006 09:44:54.264927 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98pgf" event={"ID":"4866ca28-1974-4a53-b993-5481258448b5","Type":"ContainerStarted","Data":"ee937d16e79d5e848c8c49675659fa68eb170125e8f6f5cdb9b833c6d5a98c86"} Oct 06 09:44:56 crc kubenswrapper[4755]: I1006 09:44:56.286795 4755 generic.go:334] "Generic (PLEG): container finished" podID="4866ca28-1974-4a53-b993-5481258448b5" containerID="065af4901bd0158b4e0b0212f713ce36fd3803a70b70dcfcd3305c0e52054f8c" exitCode=0 Oct 06 09:44:56 crc kubenswrapper[4755]: I1006 09:44:56.286865 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98pgf" event={"ID":"4866ca28-1974-4a53-b993-5481258448b5","Type":"ContainerDied","Data":"065af4901bd0158b4e0b0212f713ce36fd3803a70b70dcfcd3305c0e52054f8c"} Oct 06 09:44:56 crc kubenswrapper[4755]: I1006 09:44:56.435155 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-srw69/crc-debug-2clfh"] Oct 06 09:44:56 crc kubenswrapper[4755]: I1006 09:44:56.436541 4755 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-srw69/crc-debug-2clfh" Oct 06 09:44:56 crc kubenswrapper[4755]: I1006 09:44:56.500393 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e653ab1-e205-4edc-b2ed-d74060995546-host\") pod \"crc-debug-2clfh\" (UID: \"8e653ab1-e205-4edc-b2ed-d74060995546\") " pod="openshift-must-gather-srw69/crc-debug-2clfh" Oct 06 09:44:56 crc kubenswrapper[4755]: I1006 09:44:56.500542 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgvml\" (UniqueName: \"kubernetes.io/projected/8e653ab1-e205-4edc-b2ed-d74060995546-kube-api-access-rgvml\") pod \"crc-debug-2clfh\" (UID: \"8e653ab1-e205-4edc-b2ed-d74060995546\") " pod="openshift-must-gather-srw69/crc-debug-2clfh" Oct 06 09:44:56 crc kubenswrapper[4755]: I1006 09:44:56.602232 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e653ab1-e205-4edc-b2ed-d74060995546-host\") pod \"crc-debug-2clfh\" (UID: \"8e653ab1-e205-4edc-b2ed-d74060995546\") " pod="openshift-must-gather-srw69/crc-debug-2clfh" Oct 06 09:44:56 crc kubenswrapper[4755]: I1006 09:44:56.602666 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgvml\" (UniqueName: \"kubernetes.io/projected/8e653ab1-e205-4edc-b2ed-d74060995546-kube-api-access-rgvml\") pod \"crc-debug-2clfh\" (UID: \"8e653ab1-e205-4edc-b2ed-d74060995546\") " pod="openshift-must-gather-srw69/crc-debug-2clfh" Oct 06 09:44:56 crc kubenswrapper[4755]: I1006 09:44:56.603172 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e653ab1-e205-4edc-b2ed-d74060995546-host\") pod \"crc-debug-2clfh\" (UID: \"8e653ab1-e205-4edc-b2ed-d74060995546\") " 
pod="openshift-must-gather-srw69/crc-debug-2clfh" Oct 06 09:44:56 crc kubenswrapper[4755]: I1006 09:44:56.633075 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgvml\" (UniqueName: \"kubernetes.io/projected/8e653ab1-e205-4edc-b2ed-d74060995546-kube-api-access-rgvml\") pod \"crc-debug-2clfh\" (UID: \"8e653ab1-e205-4edc-b2ed-d74060995546\") " pod="openshift-must-gather-srw69/crc-debug-2clfh" Oct 06 09:44:56 crc kubenswrapper[4755]: I1006 09:44:56.781419 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-srw69/crc-debug-2clfh" Oct 06 09:44:57 crc kubenswrapper[4755]: I1006 09:44:57.313689 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-srw69/crc-debug-2clfh" event={"ID":"8e653ab1-e205-4edc-b2ed-d74060995546","Type":"ContainerStarted","Data":"f493413f5d394860c0e3497d4456d70607c0c6492eb0b828062c6b3eb452a0a6"} Oct 06 09:44:57 crc kubenswrapper[4755]: I1006 09:44:57.338265 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98pgf" event={"ID":"4866ca28-1974-4a53-b993-5481258448b5","Type":"ContainerStarted","Data":"8609245464ca28e99c07119a3bad73f2a2439c2638e6f85e4e51d79b663833e0"} Oct 06 09:44:57 crc kubenswrapper[4755]: I1006 09:44:57.394239 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-98pgf" podStartSLOduration=2.9485281260000002 podStartE2EDuration="5.394221182s" podCreationTimestamp="2025-10-06 09:44:52 +0000 UTC" firstStartedPulling="2025-10-06 09:44:54.266704919 +0000 UTC m=+4951.096020153" lastFinishedPulling="2025-10-06 09:44:56.712397995 +0000 UTC m=+4953.541713209" observedRunningTime="2025-10-06 09:44:57.392822558 +0000 UTC m=+4954.222137782" watchObservedRunningTime="2025-10-06 09:44:57.394221182 +0000 UTC m=+4954.223536396" Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.150549 4755 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm"] Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.152701 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.155122 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.159703 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.166225 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm"] Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.281957 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec107344-6f71-43d2-834f-d2678e380048-config-volume\") pod \"collect-profiles-29329065-hr5bm\" (UID: \"ec107344-6f71-43d2-834f-d2678e380048\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.282047 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec107344-6f71-43d2-834f-d2678e380048-secret-volume\") pod \"collect-profiles-29329065-hr5bm\" (UID: \"ec107344-6f71-43d2-834f-d2678e380048\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.282106 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46zps\" (UniqueName: 
\"kubernetes.io/projected/ec107344-6f71-43d2-834f-d2678e380048-kube-api-access-46zps\") pod \"collect-profiles-29329065-hr5bm\" (UID: \"ec107344-6f71-43d2-834f-d2678e380048\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.388771 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec107344-6f71-43d2-834f-d2678e380048-config-volume\") pod \"collect-profiles-29329065-hr5bm\" (UID: \"ec107344-6f71-43d2-834f-d2678e380048\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.389887 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec107344-6f71-43d2-834f-d2678e380048-secret-volume\") pod \"collect-profiles-29329065-hr5bm\" (UID: \"ec107344-6f71-43d2-834f-d2678e380048\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.390495 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46zps\" (UniqueName: \"kubernetes.io/projected/ec107344-6f71-43d2-834f-d2678e380048-kube-api-access-46zps\") pod \"collect-profiles-29329065-hr5bm\" (UID: \"ec107344-6f71-43d2-834f-d2678e380048\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.391658 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec107344-6f71-43d2-834f-d2678e380048-config-volume\") pod \"collect-profiles-29329065-hr5bm\" (UID: \"ec107344-6f71-43d2-834f-d2678e380048\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 
09:45:00.402731 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec107344-6f71-43d2-834f-d2678e380048-secret-volume\") pod \"collect-profiles-29329065-hr5bm\" (UID: \"ec107344-6f71-43d2-834f-d2678e380048\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.410759 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46zps\" (UniqueName: \"kubernetes.io/projected/ec107344-6f71-43d2-834f-d2678e380048-kube-api-access-46zps\") pod \"collect-profiles-29329065-hr5bm\" (UID: \"ec107344-6f71-43d2-834f-d2678e380048\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" Oct 06 09:45:00 crc kubenswrapper[4755]: I1006 09:45:00.495492 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" Oct 06 09:45:01 crc kubenswrapper[4755]: I1006 09:45:01.076932 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm"] Oct 06 09:45:01 crc kubenswrapper[4755]: I1006 09:45:01.382300 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" event={"ID":"ec107344-6f71-43d2-834f-d2678e380048","Type":"ContainerStarted","Data":"d1165db6ed2038e0f0cd47470dc3fd8e85c1e06ea0785dd0e55e6ac1cd75ba0b"} Oct 06 09:45:01 crc kubenswrapper[4755]: I1006 09:45:01.382750 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" event={"ID":"ec107344-6f71-43d2-834f-d2678e380048","Type":"ContainerStarted","Data":"9293acc7b4fc39028e627e5cfa781b0635f761ce25eaa5a184424bd831c8e1bb"} Oct 06 09:45:01 crc kubenswrapper[4755]: I1006 09:45:01.411191 4755 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" podStartSLOduration=1.411166734 podStartE2EDuration="1.411166734s" podCreationTimestamp="2025-10-06 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:45:01.410939998 +0000 UTC m=+4958.240255222" watchObservedRunningTime="2025-10-06 09:45:01.411166734 +0000 UTC m=+4958.240481948" Oct 06 09:45:02 crc kubenswrapper[4755]: I1006 09:45:02.373196 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:45:02 crc kubenswrapper[4755]: I1006 09:45:02.373782 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:45:02 crc kubenswrapper[4755]: I1006 09:45:02.394899 4755 generic.go:334] "Generic (PLEG): container finished" podID="ec107344-6f71-43d2-834f-d2678e380048" containerID="d1165db6ed2038e0f0cd47470dc3fd8e85c1e06ea0785dd0e55e6ac1cd75ba0b" exitCode=0 Oct 06 09:45:02 crc kubenswrapper[4755]: I1006 09:45:02.394959 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" event={"ID":"ec107344-6f71-43d2-834f-d2678e380048","Type":"ContainerDied","Data":"d1165db6ed2038e0f0cd47470dc3fd8e85c1e06ea0785dd0e55e6ac1cd75ba0b"} Oct 06 09:45:02 crc kubenswrapper[4755]: I1006 09:45:02.462775 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:45:02 crc kubenswrapper[4755]: I1006 09:45:02.520418 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:45:02 crc kubenswrapper[4755]: I1006 09:45:02.701473 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-98pgf"] Oct 06 09:45:04 crc kubenswrapper[4755]: I1006 09:45:04.421190 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-98pgf" podUID="4866ca28-1974-4a53-b993-5481258448b5" containerName="registry-server" containerID="cri-o://8609245464ca28e99c07119a3bad73f2a2439c2638e6f85e4e51d79b663833e0" gracePeriod=2 Oct 06 09:45:05 crc kubenswrapper[4755]: I1006 09:45:05.432033 4755 generic.go:334] "Generic (PLEG): container finished" podID="4866ca28-1974-4a53-b993-5481258448b5" containerID="8609245464ca28e99c07119a3bad73f2a2439c2638e6f85e4e51d79b663833e0" exitCode=0 Oct 06 09:45:05 crc kubenswrapper[4755]: I1006 09:45:05.432121 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98pgf" event={"ID":"4866ca28-1974-4a53-b993-5481258448b5","Type":"ContainerDied","Data":"8609245464ca28e99c07119a3bad73f2a2439c2638e6f85e4e51d79b663833e0"} Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.098282 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.226817 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec107344-6f71-43d2-834f-d2678e380048-secret-volume\") pod \"ec107344-6f71-43d2-834f-d2678e380048\" (UID: \"ec107344-6f71-43d2-834f-d2678e380048\") " Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.227396 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec107344-6f71-43d2-834f-d2678e380048-config-volume\") pod \"ec107344-6f71-43d2-834f-d2678e380048\" (UID: \"ec107344-6f71-43d2-834f-d2678e380048\") " Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.227456 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46zps\" (UniqueName: \"kubernetes.io/projected/ec107344-6f71-43d2-834f-d2678e380048-kube-api-access-46zps\") pod \"ec107344-6f71-43d2-834f-d2678e380048\" (UID: \"ec107344-6f71-43d2-834f-d2678e380048\") " Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.228753 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec107344-6f71-43d2-834f-d2678e380048-config-volume" (OuterVolumeSpecName: "config-volume") pod "ec107344-6f71-43d2-834f-d2678e380048" (UID: "ec107344-6f71-43d2-834f-d2678e380048"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.234976 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec107344-6f71-43d2-834f-d2678e380048-kube-api-access-46zps" (OuterVolumeSpecName: "kube-api-access-46zps") pod "ec107344-6f71-43d2-834f-d2678e380048" (UID: "ec107344-6f71-43d2-834f-d2678e380048"). 
InnerVolumeSpecName "kube-api-access-46zps". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.237804 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec107344-6f71-43d2-834f-d2678e380048-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ec107344-6f71-43d2-834f-d2678e380048" (UID: "ec107344-6f71-43d2-834f-d2678e380048"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.330868 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec107344-6f71-43d2-834f-d2678e380048-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.331228 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46zps\" (UniqueName: \"kubernetes.io/projected/ec107344-6f71-43d2-834f-d2678e380048-kube-api-access-46zps\") on node \"crc\" DevicePath \"\"" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.331238 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec107344-6f71-43d2-834f-d2678e380048-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.373720 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.433203 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kgkz\" (UniqueName: \"kubernetes.io/projected/4866ca28-1974-4a53-b993-5481258448b5-kube-api-access-4kgkz\") pod \"4866ca28-1974-4a53-b993-5481258448b5\" (UID: \"4866ca28-1974-4a53-b993-5481258448b5\") " Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.433376 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4866ca28-1974-4a53-b993-5481258448b5-catalog-content\") pod \"4866ca28-1974-4a53-b993-5481258448b5\" (UID: \"4866ca28-1974-4a53-b993-5481258448b5\") " Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.433448 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4866ca28-1974-4a53-b993-5481258448b5-utilities\") pod \"4866ca28-1974-4a53-b993-5481258448b5\" (UID: \"4866ca28-1974-4a53-b993-5481258448b5\") " Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.434511 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4866ca28-1974-4a53-b993-5481258448b5-utilities" (OuterVolumeSpecName: "utilities") pod "4866ca28-1974-4a53-b993-5481258448b5" (UID: "4866ca28-1974-4a53-b993-5481258448b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.437783 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4866ca28-1974-4a53-b993-5481258448b5-kube-api-access-4kgkz" (OuterVolumeSpecName: "kube-api-access-4kgkz") pod "4866ca28-1974-4a53-b993-5481258448b5" (UID: "4866ca28-1974-4a53-b993-5481258448b5"). InnerVolumeSpecName "kube-api-access-4kgkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.491404 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-srw69/crc-debug-2clfh" event={"ID":"8e653ab1-e205-4edc-b2ed-d74060995546","Type":"ContainerStarted","Data":"8d0b503c7d49ad9a62a220c5949e1ebf891b129c41413353f782dc30e38cc6f0"} Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.496491 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98pgf" event={"ID":"4866ca28-1974-4a53-b993-5481258448b5","Type":"ContainerDied","Data":"ee937d16e79d5e848c8c49675659fa68eb170125e8f6f5cdb9b833c6d5a98c86"} Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.496601 4755 scope.go:117] "RemoveContainer" containerID="8609245464ca28e99c07119a3bad73f2a2439c2638e6f85e4e51d79b663833e0" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.496710 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98pgf" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.504187 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" event={"ID":"ec107344-6f71-43d2-834f-d2678e380048","Type":"ContainerDied","Data":"9293acc7b4fc39028e627e5cfa781b0635f761ce25eaa5a184424bd831c8e1bb"} Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.504235 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9293acc7b4fc39028e627e5cfa781b0635f761ce25eaa5a184424bd831c8e1bb" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.504273 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-hr5bm" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.517960 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-srw69/crc-debug-2clfh" podStartSLOduration=1.23669278 podStartE2EDuration="14.517936666s" podCreationTimestamp="2025-10-06 09:44:56 +0000 UTC" firstStartedPulling="2025-10-06 09:44:56.831075531 +0000 UTC m=+4953.660390755" lastFinishedPulling="2025-10-06 09:45:10.112319427 +0000 UTC m=+4966.941634641" observedRunningTime="2025-10-06 09:45:10.508889553 +0000 UTC m=+4967.338204787" watchObservedRunningTime="2025-10-06 09:45:10.517936666 +0000 UTC m=+4967.347251880" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.522882 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4866ca28-1974-4a53-b993-5481258448b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4866ca28-1974-4a53-b993-5481258448b5" (UID: "4866ca28-1974-4a53-b993-5481258448b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.536669 4755 scope.go:117] "RemoveContainer" containerID="065af4901bd0158b4e0b0212f713ce36fd3803a70b70dcfcd3305c0e52054f8c" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.537944 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4866ca28-1974-4a53-b993-5481258448b5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.537997 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4866ca28-1974-4a53-b993-5481258448b5-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.538010 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kgkz\" (UniqueName: \"kubernetes.io/projected/4866ca28-1974-4a53-b993-5481258448b5-kube-api-access-4kgkz\") on node \"crc\" DevicePath \"\"" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.567874 4755 scope.go:117] "RemoveContainer" containerID="fb8dcbe8686b137b905ce2c9c400c2c98307dafcad754f8087778ed5734f6afc" Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.838326 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-98pgf"] Oct 06 09:45:10 crc kubenswrapper[4755]: I1006 09:45:10.848018 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-98pgf"] Oct 06 09:45:11 crc kubenswrapper[4755]: I1006 09:45:11.180149 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w"] Oct 06 09:45:11 crc kubenswrapper[4755]: I1006 09:45:11.187648 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329020-nh86w"] Oct 06 09:45:11 crc kubenswrapper[4755]: I1006 
09:45:11.893493 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4866ca28-1974-4a53-b993-5481258448b5" path="/var/lib/kubelet/pods/4866ca28-1974-4a53-b993-5481258448b5/volumes" Oct 06 09:45:11 crc kubenswrapper[4755]: I1006 09:45:11.895954 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74701c31-c0a3-4577-af42-b53554531f90" path="/var/lib/kubelet/pods/74701c31-c0a3-4577-af42-b53554531f90/volumes" Oct 06 09:45:43 crc kubenswrapper[4755]: I1006 09:45:43.205360 4755 scope.go:117] "RemoveContainer" containerID="0182e6d9781363873e34d02731c8fa0d3dbdd5b5f76053722da80ed2fbb5029d" Oct 06 09:45:48 crc kubenswrapper[4755]: I1006 09:45:48.912843 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:45:48 crc kubenswrapper[4755]: I1006 09:45:48.913268 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:46:06 crc kubenswrapper[4755]: I1006 09:46:06.529415 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7567ddf88-bwsd4_cd538b73-7f94-452c-a366-692369e490da/barbican-api/0.log" Oct 06 09:46:06 crc kubenswrapper[4755]: I1006 09:46:06.802667 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7567ddf88-bwsd4_cd538b73-7f94-452c-a366-692369e490da/barbican-api-log/0.log" Oct 06 09:46:06 crc kubenswrapper[4755]: I1006 09:46:06.973315 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-c66599474-j7m6l_d155ddb9-1b21-4346-8858-2aba24321b8a/barbican-keystone-listener/0.log" Oct 06 09:46:07 crc kubenswrapper[4755]: I1006 09:46:07.186806 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c66599474-j7m6l_d155ddb9-1b21-4346-8858-2aba24321b8a/barbican-keystone-listener-log/0.log" Oct 06 09:46:07 crc kubenswrapper[4755]: I1006 09:46:07.349259 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6589865cf-k6ldq_c90c3ed0-a9cd-4589-ba3a-4c77e2163190/barbican-worker/0.log" Oct 06 09:46:07 crc kubenswrapper[4755]: I1006 09:46:07.364888 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6589865cf-k6ldq_c90c3ed0-a9cd-4589-ba3a-4c77e2163190/barbican-worker-log/0.log" Oct 06 09:46:07 crc kubenswrapper[4755]: I1006 09:46:07.568070 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-dks7k_7e47d328-ee62-4e39-9912-8b08b04189c0/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:07 crc kubenswrapper[4755]: I1006 09:46:07.853452 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e5a8fee4-7b04-4487-b9d4-718640b217e4/ceilometer-central-agent/0.log" Oct 06 09:46:08 crc kubenswrapper[4755]: I1006 09:46:08.055042 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e5a8fee4-7b04-4487-b9d4-718640b217e4/proxy-httpd/0.log" Oct 06 09:46:08 crc kubenswrapper[4755]: I1006 09:46:08.059116 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e5a8fee4-7b04-4487-b9d4-718640b217e4/ceilometer-notification-agent/0.log" Oct 06 09:46:08 crc kubenswrapper[4755]: I1006 09:46:08.268498 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e5a8fee4-7b04-4487-b9d4-718640b217e4/sg-core/0.log" Oct 06 
09:46:08 crc kubenswrapper[4755]: I1006 09:46:08.970833 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-kl9q4_3d796d4f-17fe-44fc-a23b-8bb3a3dc71ac/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:09 crc kubenswrapper[4755]: I1006 09:46:09.168180 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-xpv6c_0f5c98c8-0bd9-4a85-b547-9c39183abe87/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:09 crc kubenswrapper[4755]: I1006 09:46:09.782929 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d3a80bf3-70cf-465d-a429-caf75c375027/cinder-api/0.log" Oct 06 09:46:09 crc kubenswrapper[4755]: I1006 09:46:09.949691 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d3a80bf3-70cf-465d-a429-caf75c375027/cinder-api-log/0.log" Oct 06 09:46:10 crc kubenswrapper[4755]: I1006 09:46:10.726878 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_708a75eb-b436-40c0-b25c-8935f399cb4a/probe/0.log" Oct 06 09:46:10 crc kubenswrapper[4755]: I1006 09:46:10.900933 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fb717d16-7646-4b16-b1fb-fb2f580ceaf9/cinder-scheduler/0.log" Oct 06 09:46:11 crc kubenswrapper[4755]: I1006 09:46:11.016253 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_708a75eb-b436-40c0-b25c-8935f399cb4a/cinder-backup/0.log" Oct 06 09:46:11 crc kubenswrapper[4755]: I1006 09:46:11.019680 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fb717d16-7646-4b16-b1fb-fb2f580ceaf9/probe/0.log" Oct 06 09:46:11 crc kubenswrapper[4755]: I1006 09:46:11.282781 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-volume-volume1-0_e9a5baec-e335-4430-87ff-df995cc28434/probe/0.log" Oct 06 09:46:11 crc kubenswrapper[4755]: I1006 09:46:11.719219 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-qc2zz_be08c4ee-663e-4a5e-b69a-2a6b7ee27e9b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:11 crc kubenswrapper[4755]: I1006 09:46:11.932399 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ck77l_5f68ca6b-cc42-460a-9490-b29b87004e16/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:12 crc kubenswrapper[4755]: I1006 09:46:12.150662 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-nl7nk_9fe5b7f4-6615-4f73-8116-51fcda3ef59e/init/0.log" Oct 06 09:46:12 crc kubenswrapper[4755]: I1006 09:46:12.337722 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-nl7nk_9fe5b7f4-6615-4f73-8116-51fcda3ef59e/init/0.log" Oct 06 09:46:12 crc kubenswrapper[4755]: I1006 09:46:12.437885 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-nl7nk_9fe5b7f4-6615-4f73-8116-51fcda3ef59e/dnsmasq-dns/0.log" Oct 06 09:46:12 crc kubenswrapper[4755]: I1006 09:46:12.656466 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_594fa22c-967e-4e3f-aa38-609978425183/glance-httpd/0.log" Oct 06 09:46:12 crc kubenswrapper[4755]: I1006 09:46:12.742080 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_594fa22c-967e-4e3f-aa38-609978425183/glance-log/0.log" Oct 06 09:46:12 crc kubenswrapper[4755]: I1006 09:46:12.969235 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_5f6b8f7e-5571-4367-ad0c-76688c6968d3/glance-httpd/0.log" Oct 06 09:46:13 crc kubenswrapper[4755]: I1006 09:46:13.100249 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f6b8f7e-5571-4367-ad0c-76688c6968d3/glance-log/0.log" Oct 06 09:46:13 crc kubenswrapper[4755]: I1006 09:46:13.428985 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5797d74dbd-6v4nj_3583b65a-632c-4988-81fa-d1ee08e8f258/horizon/0.log" Oct 06 09:46:13 crc kubenswrapper[4755]: I1006 09:46:13.689413 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-ql54j_23de9f2b-8aa9-4d9c-905a-e317082fffc8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:13 crc kubenswrapper[4755]: I1006 09:46:13.749938 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5797d74dbd-6v4nj_3583b65a-632c-4988-81fa-d1ee08e8f258/horizon-log/0.log" Oct 06 09:46:13 crc kubenswrapper[4755]: I1006 09:46:13.926786 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-rgzld_629dfd56-994c-4d9e-ba10-ecd79d750142/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:14 crc kubenswrapper[4755]: I1006 09:46:14.170668 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29329021-jcqs8_48ed59bc-4f30-4c9f-9c3a-9628f0a5b314/keystone-cron/0.log" Oct 06 09:46:14 crc kubenswrapper[4755]: I1006 09:46:14.378045 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ecac61c8-6fb8-4eac-b34b-2589131fbeca/kube-state-metrics/0.log" Oct 06 09:46:14 crc kubenswrapper[4755]: I1006 09:46:14.667076 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-s2r7l_89a25b96-fda5-4f63-bf99-550ba2c00701/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:14 crc kubenswrapper[4755]: I1006 09:46:14.767824 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-67f564d7bf-cx47l_e5a3f24e-8de2-49da-aa12-9559bf1c0212/keystone-api/0.log" Oct 06 09:46:14 crc kubenswrapper[4755]: I1006 09:46:14.992745 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_2c1660df-89ac-403d-8343-195b26f04e5e/manila-api/0.log" Oct 06 09:46:15 crc kubenswrapper[4755]: I1006 09:46:15.008165 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_2c1660df-89ac-403d-8343-195b26f04e5e/manila-api-log/0.log" Oct 06 09:46:15 crc kubenswrapper[4755]: I1006 09:46:15.210909 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_9874a772-0d86-4316-9512-139b7b140518/probe/0.log" Oct 06 09:46:15 crc kubenswrapper[4755]: I1006 09:46:15.357675 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_9874a772-0d86-4316-9512-139b7b140518/manila-scheduler/0.log" Oct 06 09:46:15 crc kubenswrapper[4755]: I1006 09:46:15.514836 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_400eacf6-c350-4499-b95f-1e8ad5b09dab/manila-share/0.log" Oct 06 09:46:15 crc kubenswrapper[4755]: I1006 09:46:15.564069 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_400eacf6-c350-4499-b95f-1e8ad5b09dab/probe/0.log" Oct 06 09:46:16 crc kubenswrapper[4755]: I1006 09:46:16.139113 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6854f5796f-f7f5s_7250ab29-650e-46db-ba50-5d20579db8b6/neutron-httpd/0.log" Oct 06 09:46:16 crc kubenswrapper[4755]: I1006 09:46:16.239538 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-6854f5796f-f7f5s_7250ab29-650e-46db-ba50-5d20579db8b6/neutron-api/0.log" Oct 06 09:46:16 crc kubenswrapper[4755]: I1006 09:46:16.616430 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-hsnc4_4c7b1eb8-1304-4cb2-aae7-e302a978c2c2/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:17 crc kubenswrapper[4755]: I1006 09:46:17.775059 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb/nova-api-log/0.log" Oct 06 09:46:18 crc kubenswrapper[4755]: I1006 09:46:18.150960 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0c3b9ddb-5ed7-46cd-a389-09f1ff73c8eb/nova-api-api/0.log" Oct 06 09:46:18 crc kubenswrapper[4755]: I1006 09:46:18.912797 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:46:18 crc kubenswrapper[4755]: I1006 09:46:18.912866 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:46:18 crc kubenswrapper[4755]: I1006 09:46:18.920330 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_68ef8a3b-ac54-4a60-bdf6-fc5d5d3fb66c/nova-cell0-conductor-conductor/0.log" Oct 06 09:46:19 crc kubenswrapper[4755]: I1006 09:46:19.136008 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_36bddbbc-1f14-40bd-921e-f5b85fc1b68e/nova-cell1-conductor-conductor/0.log" Oct 06 09:46:19 crc kubenswrapper[4755]: I1006 09:46:19.200622 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_e9a5baec-e335-4430-87ff-df995cc28434/cinder-volume/0.log" Oct 06 09:46:19 crc kubenswrapper[4755]: I1006 09:46:19.441945 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b9450bb6-dd0a-4168-b34d-239829435571/nova-cell1-novncproxy-novncproxy/0.log" Oct 06 09:46:19 crc kubenswrapper[4755]: I1006 09:46:19.570740 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8bb4b_6998032f-4cc5-4d30-8d2a-4c70731c20eb/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:19 crc kubenswrapper[4755]: I1006 09:46:19.798976 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9/nova-metadata-log/0.log" Oct 06 09:46:20 crc kubenswrapper[4755]: I1006 09:46:20.259918 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e27e627a-4f73-40cb-8ca0-e01c190266d3/nova-scheduler-scheduler/0.log" Oct 06 09:46:20 crc kubenswrapper[4755]: I1006 09:46:20.803979 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0ea480ba-e1ea-47db-b647-39833517fcad/mysql-bootstrap/0.log" Oct 06 09:46:20 crc kubenswrapper[4755]: I1006 09:46:20.978681 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0ea480ba-e1ea-47db-b647-39833517fcad/mysql-bootstrap/0.log" Oct 06 09:46:21 crc kubenswrapper[4755]: I1006 09:46:21.016323 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0ea480ba-e1ea-47db-b647-39833517fcad/galera/0.log" Oct 06 09:46:21 crc 
kubenswrapper[4755]: I1006 09:46:21.253929 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b13f13fe-a34f-4566-b0bd-31b326722b01/mysql-bootstrap/0.log" Oct 06 09:46:21 crc kubenswrapper[4755]: I1006 09:46:21.566741 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b13f13fe-a34f-4566-b0bd-31b326722b01/mysql-bootstrap/0.log" Oct 06 09:46:21 crc kubenswrapper[4755]: I1006 09:46:21.887129 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b13f13fe-a34f-4566-b0bd-31b326722b01/galera/0.log" Oct 06 09:46:21 crc kubenswrapper[4755]: I1006 09:46:21.978718 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d0ab2ce5-aa43-4d23-ae5c-26cb00ed04c9/nova-metadata-metadata/0.log" Oct 06 09:46:22 crc kubenswrapper[4755]: I1006 09:46:22.116652 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_9805ff75-3e68-41eb-a711-ecc8e70ee16a/openstackclient/0.log" Oct 06 09:46:22 crc kubenswrapper[4755]: I1006 09:46:22.310847 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-b4rd2_5dbdee79-0740-4068-a155-e865fe787402/ovn-controller/0.log" Oct 06 09:46:22 crc kubenswrapper[4755]: I1006 09:46:22.480737 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-flm4c_c01d8771-a0d8-436a-883d-5fb95dec3b59/openstack-network-exporter/0.log" Oct 06 09:46:22 crc kubenswrapper[4755]: I1006 09:46:22.573988 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jd94b_5a6dfcfa-6e0a-427c-88df-0619afb0195c/ovsdb-server-init/0.log" Oct 06 09:46:22 crc kubenswrapper[4755]: I1006 09:46:22.936017 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jd94b_5a6dfcfa-6e0a-427c-88df-0619afb0195c/ovsdb-server-init/0.log" Oct 06 09:46:22 crc kubenswrapper[4755]: 
I1006 09:46:22.939235 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jd94b_5a6dfcfa-6e0a-427c-88df-0619afb0195c/ovs-vswitchd/0.log" Oct 06 09:46:23 crc kubenswrapper[4755]: I1006 09:46:23.078457 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jd94b_5a6dfcfa-6e0a-427c-88df-0619afb0195c/ovsdb-server/0.log" Oct 06 09:46:23 crc kubenswrapper[4755]: I1006 09:46:23.269528 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fhz9w_c1401167-6476-4be8-8b96-e7c302f4d7f7/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:23 crc kubenswrapper[4755]: I1006 09:46:23.434897 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8aa143d7-a987-43a1-992c-7b33b12710dd/openstack-network-exporter/0.log" Oct 06 09:46:23 crc kubenswrapper[4755]: I1006 09:46:23.547622 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8aa143d7-a987-43a1-992c-7b33b12710dd/ovn-northd/0.log" Oct 06 09:46:23 crc kubenswrapper[4755]: I1006 09:46:23.656547 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3327c559-a028-4094-be53-cc5c7c116a6f/openstack-network-exporter/0.log" Oct 06 09:46:23 crc kubenswrapper[4755]: I1006 09:46:23.874198 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3327c559-a028-4094-be53-cc5c7c116a6f/ovsdbserver-nb/0.log" Oct 06 09:46:24 crc kubenswrapper[4755]: I1006 09:46:24.007302 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d7cf61af-2469-48d4-b3e9-77267e7d5328/openstack-network-exporter/0.log" Oct 06 09:46:24 crc kubenswrapper[4755]: I1006 09:46:24.085622 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d7cf61af-2469-48d4-b3e9-77267e7d5328/ovsdbserver-sb/0.log" Oct 06 09:46:24 crc 
kubenswrapper[4755]: I1006 09:46:24.428209 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7646c5cd7b-lvntf_30667b83-ed3b-414b-af66-45b97ac252c1/placement-api/0.log" Oct 06 09:46:24 crc kubenswrapper[4755]: I1006 09:46:24.505808 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7646c5cd7b-lvntf_30667b83-ed3b-414b-af66-45b97ac252c1/placement-log/0.log" Oct 06 09:46:24 crc kubenswrapper[4755]: I1006 09:46:24.660660 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b378698d-a5e1-4538-93e2-694516a551b1/setup-container/0.log" Oct 06 09:46:24 crc kubenswrapper[4755]: I1006 09:46:24.923960 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b378698d-a5e1-4538-93e2-694516a551b1/setup-container/0.log" Oct 06 09:46:24 crc kubenswrapper[4755]: I1006 09:46:24.932523 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b378698d-a5e1-4538-93e2-694516a551b1/rabbitmq/0.log" Oct 06 09:46:25 crc kubenswrapper[4755]: I1006 09:46:25.126180 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6/setup-container/0.log" Oct 06 09:46:25 crc kubenswrapper[4755]: I1006 09:46:25.357211 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6/setup-container/0.log" Oct 06 09:46:25 crc kubenswrapper[4755]: I1006 09:46:25.372591 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5ba712cd-d4ba-44f6-a400-49b8ff9fa8b6/rabbitmq/0.log" Oct 06 09:46:25 crc kubenswrapper[4755]: I1006 09:46:25.614126 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-69svk_603da326-0d3a-43a3-b32b-f02d46177a85/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" 
Oct 06 09:46:25 crc kubenswrapper[4755]: I1006 09:46:25.674469 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jwvbk_7735ba28-55e1-42e4-8fee-463bb64a240a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:25 crc kubenswrapper[4755]: I1006 09:46:25.916090 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-qvnpz_ffc3408e-1aa7-4d2b-b2d0-56106ba2f24e/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:26 crc kubenswrapper[4755]: I1006 09:46:26.166053 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-64v5h_73b888de-77c2-4fbf-a443-37ce0a5c28d3/ssh-known-hosts-edpm-deployment/0.log" Oct 06 09:46:26 crc kubenswrapper[4755]: I1006 09:46:26.293740 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d5b996de-cff3-4a46-bfd0-25e7833f58e8/tempest-tests-tempest-tests-runner/0.log" Oct 06 09:46:26 crc kubenswrapper[4755]: I1006 09:46:26.515500 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_391f42bd-301b-4593-bc2d-3955a5114d27/test-operator-logs-container/0.log" Oct 06 09:46:26 crc kubenswrapper[4755]: I1006 09:46:26.736809 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9dhpv_1246bd47-a86b-4708-8124-a77064659911/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 09:46:34 crc kubenswrapper[4755]: I1006 09:46:34.590418 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_70279926-92db-4788-b714-f14f60f4c55d/memcached/0.log" Oct 06 09:46:48 crc kubenswrapper[4755]: I1006 09:46:48.912475 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:46:48 crc kubenswrapper[4755]: I1006 09:46:48.913366 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:46:48 crc kubenswrapper[4755]: I1006 09:46:48.913445 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 09:46:48 crc kubenswrapper[4755]: I1006 09:46:48.914529 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"23a9d9458ee8bfc68906867f6ba2d0cc07d6b8cdc0736c7276d2a9fe7b88f3f9"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:46:48 crc kubenswrapper[4755]: I1006 09:46:48.914658 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://23a9d9458ee8bfc68906867f6ba2d0cc07d6b8cdc0736c7276d2a9fe7b88f3f9" gracePeriod=600 Oct 06 09:46:49 crc kubenswrapper[4755]: I1006 09:46:49.471952 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="23a9d9458ee8bfc68906867f6ba2d0cc07d6b8cdc0736c7276d2a9fe7b88f3f9" exitCode=0 Oct 06 09:46:49 crc kubenswrapper[4755]: I1006 09:46:49.472297 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"23a9d9458ee8bfc68906867f6ba2d0cc07d6b8cdc0736c7276d2a9fe7b88f3f9"} Oct 06 09:46:49 crc kubenswrapper[4755]: I1006 09:46:49.472331 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerStarted","Data":"56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226"} Oct 06 09:46:49 crc kubenswrapper[4755]: I1006 09:46:49.472376 4755 scope.go:117] "RemoveContainer" containerID="b8195991974c93040022ecf123cc13df92156bbb00681beae7164dd40656cafd" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.604925 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tdsbr"] Oct 06 09:46:57 crc kubenswrapper[4755]: E1006 09:46:57.607972 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec107344-6f71-43d2-834f-d2678e380048" containerName="collect-profiles" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.607989 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec107344-6f71-43d2-834f-d2678e380048" containerName="collect-profiles" Oct 06 09:46:57 crc kubenswrapper[4755]: E1006 09:46:57.608019 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4866ca28-1974-4a53-b993-5481258448b5" containerName="extract-utilities" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.608026 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4866ca28-1974-4a53-b993-5481258448b5" containerName="extract-utilities" Oct 06 09:46:57 crc kubenswrapper[4755]: E1006 09:46:57.608047 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4866ca28-1974-4a53-b993-5481258448b5" containerName="extract-content" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.608055 4755 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="4866ca28-1974-4a53-b993-5481258448b5" containerName="extract-content" Oct 06 09:46:57 crc kubenswrapper[4755]: E1006 09:46:57.608068 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4866ca28-1974-4a53-b993-5481258448b5" containerName="registry-server" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.608075 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4866ca28-1974-4a53-b993-5481258448b5" containerName="registry-server" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.608252 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec107344-6f71-43d2-834f-d2678e380048" containerName="collect-profiles" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.608265 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4866ca28-1974-4a53-b993-5481258448b5" containerName="registry-server" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.609639 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.619400 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdsbr"] Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.680030 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z8mw\" (UniqueName: \"kubernetes.io/projected/c32c81bd-1808-40d1-8e6e-ede7e091d994-kube-api-access-9z8mw\") pod \"redhat-marketplace-tdsbr\" (UID: \"c32c81bd-1808-40d1-8e6e-ede7e091d994\") " pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.680234 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c32c81bd-1808-40d1-8e6e-ede7e091d994-catalog-content\") pod \"redhat-marketplace-tdsbr\" (UID: \"c32c81bd-1808-40d1-8e6e-ede7e091d994\") " pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.680515 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c32c81bd-1808-40d1-8e6e-ede7e091d994-utilities\") pod \"redhat-marketplace-tdsbr\" (UID: \"c32c81bd-1808-40d1-8e6e-ede7e091d994\") " pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.782675 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z8mw\" (UniqueName: \"kubernetes.io/projected/c32c81bd-1808-40d1-8e6e-ede7e091d994-kube-api-access-9z8mw\") pod \"redhat-marketplace-tdsbr\" (UID: \"c32c81bd-1808-40d1-8e6e-ede7e091d994\") " pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.783003 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c32c81bd-1808-40d1-8e6e-ede7e091d994-catalog-content\") pod \"redhat-marketplace-tdsbr\" (UID: \"c32c81bd-1808-40d1-8e6e-ede7e091d994\") " pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.783100 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c32c81bd-1808-40d1-8e6e-ede7e091d994-utilities\") pod \"redhat-marketplace-tdsbr\" (UID: \"c32c81bd-1808-40d1-8e6e-ede7e091d994\") " pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.783541 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c32c81bd-1808-40d1-8e6e-ede7e091d994-catalog-content\") pod \"redhat-marketplace-tdsbr\" (UID: \"c32c81bd-1808-40d1-8e6e-ede7e091d994\") " pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.783659 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c32c81bd-1808-40d1-8e6e-ede7e091d994-utilities\") pod \"redhat-marketplace-tdsbr\" (UID: \"c32c81bd-1808-40d1-8e6e-ede7e091d994\") " pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.814324 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z8mw\" (UniqueName: \"kubernetes.io/projected/c32c81bd-1808-40d1-8e6e-ede7e091d994-kube-api-access-9z8mw\") pod \"redhat-marketplace-tdsbr\" (UID: \"c32c81bd-1808-40d1-8e6e-ede7e091d994\") " pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:46:57 crc kubenswrapper[4755]: I1006 09:46:57.948651 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:46:58 crc kubenswrapper[4755]: I1006 09:46:58.430915 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdsbr"] Oct 06 09:46:58 crc kubenswrapper[4755]: W1006 09:46:58.432322 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32c81bd_1808_40d1_8e6e_ede7e091d994.slice/crio-c6916539184c55af39b5c04c3f458047c1c92cec642698e10ad60604bbd60f08 WatchSource:0}: Error finding container c6916539184c55af39b5c04c3f458047c1c92cec642698e10ad60604bbd60f08: Status 404 returned error can't find the container with id c6916539184c55af39b5c04c3f458047c1c92cec642698e10ad60604bbd60f08 Oct 06 09:46:58 crc kubenswrapper[4755]: I1006 09:46:58.588709 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdsbr" event={"ID":"c32c81bd-1808-40d1-8e6e-ede7e091d994","Type":"ContainerStarted","Data":"c6916539184c55af39b5c04c3f458047c1c92cec642698e10ad60604bbd60f08"} Oct 06 09:46:59 crc kubenswrapper[4755]: I1006 09:46:59.599701 4755 generic.go:334] "Generic (PLEG): container finished" podID="c32c81bd-1808-40d1-8e6e-ede7e091d994" containerID="dfc4d246c1ff47e89bfb33bb8f1e672ffb35247d6058c1487320931fdcef4d3e" exitCode=0 Oct 06 09:46:59 crc kubenswrapper[4755]: I1006 09:46:59.599786 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdsbr" event={"ID":"c32c81bd-1808-40d1-8e6e-ede7e091d994","Type":"ContainerDied","Data":"dfc4d246c1ff47e89bfb33bb8f1e672ffb35247d6058c1487320931fdcef4d3e"} Oct 06 09:46:59 crc kubenswrapper[4755]: I1006 09:46:59.602011 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:47:01 crc kubenswrapper[4755]: I1006 09:47:01.620925 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="c32c81bd-1808-40d1-8e6e-ede7e091d994" containerID="a5bbc1d5d1f581b5a3e589e9e021bec489669746c39b1da28b31705c804ee4af" exitCode=0 Oct 06 09:47:01 crc kubenswrapper[4755]: I1006 09:47:01.621032 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdsbr" event={"ID":"c32c81bd-1808-40d1-8e6e-ede7e091d994","Type":"ContainerDied","Data":"a5bbc1d5d1f581b5a3e589e9e021bec489669746c39b1da28b31705c804ee4af"} Oct 06 09:47:02 crc kubenswrapper[4755]: I1006 09:47:02.632132 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdsbr" event={"ID":"c32c81bd-1808-40d1-8e6e-ede7e091d994","Type":"ContainerStarted","Data":"68357baaaacfdbdb4840dc0bb795eb49e29c76d8516158ae98624681f510a735"} Oct 06 09:47:02 crc kubenswrapper[4755]: I1006 09:47:02.654667 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tdsbr" podStartSLOduration=3.188458497 podStartE2EDuration="5.654646947s" podCreationTimestamp="2025-10-06 09:46:57 +0000 UTC" firstStartedPulling="2025-10-06 09:46:59.601791689 +0000 UTC m=+5076.431106903" lastFinishedPulling="2025-10-06 09:47:02.067980139 +0000 UTC m=+5078.897295353" observedRunningTime="2025-10-06 09:47:02.648524117 +0000 UTC m=+5079.477839341" watchObservedRunningTime="2025-10-06 09:47:02.654646947 +0000 UTC m=+5079.483962161" Oct 06 09:47:07 crc kubenswrapper[4755]: I1006 09:47:07.949521 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:47:07 crc kubenswrapper[4755]: I1006 09:47:07.950390 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:47:08 crc kubenswrapper[4755]: I1006 09:47:08.030406 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:47:08 crc 
kubenswrapper[4755]: I1006 09:47:08.884924 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:47:08 crc kubenswrapper[4755]: I1006 09:47:08.939378 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdsbr"] Oct 06 09:47:09 crc kubenswrapper[4755]: I1006 09:47:09.705403 4755 generic.go:334] "Generic (PLEG): container finished" podID="8e653ab1-e205-4edc-b2ed-d74060995546" containerID="8d0b503c7d49ad9a62a220c5949e1ebf891b129c41413353f782dc30e38cc6f0" exitCode=0 Oct 06 09:47:09 crc kubenswrapper[4755]: I1006 09:47:09.705454 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-srw69/crc-debug-2clfh" event={"ID":"8e653ab1-e205-4edc-b2ed-d74060995546","Type":"ContainerDied","Data":"8d0b503c7d49ad9a62a220c5949e1ebf891b129c41413353f782dc30e38cc6f0"} Oct 06 09:47:10 crc kubenswrapper[4755]: I1006 09:47:10.716459 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tdsbr" podUID="c32c81bd-1808-40d1-8e6e-ede7e091d994" containerName="registry-server" containerID="cri-o://68357baaaacfdbdb4840dc0bb795eb49e29c76d8516158ae98624681f510a735" gracePeriod=2 Oct 06 09:47:10 crc kubenswrapper[4755]: I1006 09:47:10.985444 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-srw69/crc-debug-2clfh" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.045474 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-srw69/crc-debug-2clfh"] Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.054923 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-srw69/crc-debug-2clfh"] Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.078613 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgvml\" (UniqueName: \"kubernetes.io/projected/8e653ab1-e205-4edc-b2ed-d74060995546-kube-api-access-rgvml\") pod \"8e653ab1-e205-4edc-b2ed-d74060995546\" (UID: \"8e653ab1-e205-4edc-b2ed-d74060995546\") " Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.078924 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e653ab1-e205-4edc-b2ed-d74060995546-host\") pod \"8e653ab1-e205-4edc-b2ed-d74060995546\" (UID: \"8e653ab1-e205-4edc-b2ed-d74060995546\") " Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.079195 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e653ab1-e205-4edc-b2ed-d74060995546-host" (OuterVolumeSpecName: "host") pod "8e653ab1-e205-4edc-b2ed-d74060995546" (UID: "8e653ab1-e205-4edc-b2ed-d74060995546"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.079870 4755 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e653ab1-e205-4edc-b2ed-d74060995546-host\") on node \"crc\" DevicePath \"\"" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.086638 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e653ab1-e205-4edc-b2ed-d74060995546-kube-api-access-rgvml" (OuterVolumeSpecName: "kube-api-access-rgvml") pod "8e653ab1-e205-4edc-b2ed-d74060995546" (UID: "8e653ab1-e205-4edc-b2ed-d74060995546"). InnerVolumeSpecName "kube-api-access-rgvml". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.182988 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgvml\" (UniqueName: \"kubernetes.io/projected/8e653ab1-e205-4edc-b2ed-d74060995546-kube-api-access-rgvml\") on node \"crc\" DevicePath \"\"" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.211961 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.284045 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c32c81bd-1808-40d1-8e6e-ede7e091d994-utilities\") pod \"c32c81bd-1808-40d1-8e6e-ede7e091d994\" (UID: \"c32c81bd-1808-40d1-8e6e-ede7e091d994\") " Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.284177 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z8mw\" (UniqueName: \"kubernetes.io/projected/c32c81bd-1808-40d1-8e6e-ede7e091d994-kube-api-access-9z8mw\") pod \"c32c81bd-1808-40d1-8e6e-ede7e091d994\" (UID: \"c32c81bd-1808-40d1-8e6e-ede7e091d994\") " Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.284242 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c32c81bd-1808-40d1-8e6e-ede7e091d994-catalog-content\") pod \"c32c81bd-1808-40d1-8e6e-ede7e091d994\" (UID: \"c32c81bd-1808-40d1-8e6e-ede7e091d994\") " Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.285246 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c32c81bd-1808-40d1-8e6e-ede7e091d994-utilities" (OuterVolumeSpecName: "utilities") pod "c32c81bd-1808-40d1-8e6e-ede7e091d994" (UID: "c32c81bd-1808-40d1-8e6e-ede7e091d994"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.290490 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32c81bd-1808-40d1-8e6e-ede7e091d994-kube-api-access-9z8mw" (OuterVolumeSpecName: "kube-api-access-9z8mw") pod "c32c81bd-1808-40d1-8e6e-ede7e091d994" (UID: "c32c81bd-1808-40d1-8e6e-ede7e091d994"). InnerVolumeSpecName "kube-api-access-9z8mw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.299547 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c32c81bd-1808-40d1-8e6e-ede7e091d994-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c32c81bd-1808-40d1-8e6e-ede7e091d994" (UID: "c32c81bd-1808-40d1-8e6e-ede7e091d994"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.386469 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c32c81bd-1808-40d1-8e6e-ede7e091d994-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.386910 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c32c81bd-1808-40d1-8e6e-ede7e091d994-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.386922 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z8mw\" (UniqueName: \"kubernetes.io/projected/c32c81bd-1808-40d1-8e6e-ede7e091d994-kube-api-access-9z8mw\") on node \"crc\" DevicePath \"\"" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.732286 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f493413f5d394860c0e3497d4456d70607c0c6492eb0b828062c6b3eb452a0a6" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.732351 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-srw69/crc-debug-2clfh" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.738600 4755 generic.go:334] "Generic (PLEG): container finished" podID="c32c81bd-1808-40d1-8e6e-ede7e091d994" containerID="68357baaaacfdbdb4840dc0bb795eb49e29c76d8516158ae98624681f510a735" exitCode=0 Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.738663 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdsbr" event={"ID":"c32c81bd-1808-40d1-8e6e-ede7e091d994","Type":"ContainerDied","Data":"68357baaaacfdbdb4840dc0bb795eb49e29c76d8516158ae98624681f510a735"} Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.738703 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdsbr" event={"ID":"c32c81bd-1808-40d1-8e6e-ede7e091d994","Type":"ContainerDied","Data":"c6916539184c55af39b5c04c3f458047c1c92cec642698e10ad60604bbd60f08"} Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.738728 4755 scope.go:117] "RemoveContainer" containerID="68357baaaacfdbdb4840dc0bb795eb49e29c76d8516158ae98624681f510a735" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.738930 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdsbr" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.816158 4755 scope.go:117] "RemoveContainer" containerID="a5bbc1d5d1f581b5a3e589e9e021bec489669746c39b1da28b31705c804ee4af" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.820636 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdsbr"] Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.832778 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdsbr"] Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.852156 4755 scope.go:117] "RemoveContainer" containerID="dfc4d246c1ff47e89bfb33bb8f1e672ffb35247d6058c1487320931fdcef4d3e" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.900692 4755 scope.go:117] "RemoveContainer" containerID="68357baaaacfdbdb4840dc0bb795eb49e29c76d8516158ae98624681f510a735" Oct 06 09:47:11 crc kubenswrapper[4755]: E1006 09:47:11.901177 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68357baaaacfdbdb4840dc0bb795eb49e29c76d8516158ae98624681f510a735\": container with ID starting with 68357baaaacfdbdb4840dc0bb795eb49e29c76d8516158ae98624681f510a735 not found: ID does not exist" containerID="68357baaaacfdbdb4840dc0bb795eb49e29c76d8516158ae98624681f510a735" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.901254 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68357baaaacfdbdb4840dc0bb795eb49e29c76d8516158ae98624681f510a735"} err="failed to get container status \"68357baaaacfdbdb4840dc0bb795eb49e29c76d8516158ae98624681f510a735\": rpc error: code = NotFound desc = could not find container \"68357baaaacfdbdb4840dc0bb795eb49e29c76d8516158ae98624681f510a735\": container with ID starting with 68357baaaacfdbdb4840dc0bb795eb49e29c76d8516158ae98624681f510a735 not found: 
ID does not exist" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.901303 4755 scope.go:117] "RemoveContainer" containerID="a5bbc1d5d1f581b5a3e589e9e021bec489669746c39b1da28b31705c804ee4af" Oct 06 09:47:11 crc kubenswrapper[4755]: E1006 09:47:11.903001 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5bbc1d5d1f581b5a3e589e9e021bec489669746c39b1da28b31705c804ee4af\": container with ID starting with a5bbc1d5d1f581b5a3e589e9e021bec489669746c39b1da28b31705c804ee4af not found: ID does not exist" containerID="a5bbc1d5d1f581b5a3e589e9e021bec489669746c39b1da28b31705c804ee4af" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.903050 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5bbc1d5d1f581b5a3e589e9e021bec489669746c39b1da28b31705c804ee4af"} err="failed to get container status \"a5bbc1d5d1f581b5a3e589e9e021bec489669746c39b1da28b31705c804ee4af\": rpc error: code = NotFound desc = could not find container \"a5bbc1d5d1f581b5a3e589e9e021bec489669746c39b1da28b31705c804ee4af\": container with ID starting with a5bbc1d5d1f581b5a3e589e9e021bec489669746c39b1da28b31705c804ee4af not found: ID does not exist" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.903086 4755 scope.go:117] "RemoveContainer" containerID="dfc4d246c1ff47e89bfb33bb8f1e672ffb35247d6058c1487320931fdcef4d3e" Oct 06 09:47:11 crc kubenswrapper[4755]: E1006 09:47:11.903815 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfc4d246c1ff47e89bfb33bb8f1e672ffb35247d6058c1487320931fdcef4d3e\": container with ID starting with dfc4d246c1ff47e89bfb33bb8f1e672ffb35247d6058c1487320931fdcef4d3e not found: ID does not exist" containerID="dfc4d246c1ff47e89bfb33bb8f1e672ffb35247d6058c1487320931fdcef4d3e" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.904115 4755 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfc4d246c1ff47e89bfb33bb8f1e672ffb35247d6058c1487320931fdcef4d3e"} err="failed to get container status \"dfc4d246c1ff47e89bfb33bb8f1e672ffb35247d6058c1487320931fdcef4d3e\": rpc error: code = NotFound desc = could not find container \"dfc4d246c1ff47e89bfb33bb8f1e672ffb35247d6058c1487320931fdcef4d3e\": container with ID starting with dfc4d246c1ff47e89bfb33bb8f1e672ffb35247d6058c1487320931fdcef4d3e not found: ID does not exist" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.905223 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e653ab1-e205-4edc-b2ed-d74060995546" path="/var/lib/kubelet/pods/8e653ab1-e205-4edc-b2ed-d74060995546/volumes" Oct 06 09:47:11 crc kubenswrapper[4755]: I1006 09:47:11.906088 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c32c81bd-1808-40d1-8e6e-ede7e091d994" path="/var/lib/kubelet/pods/c32c81bd-1808-40d1-8e6e-ede7e091d994/volumes" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.251977 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-srw69/crc-debug-hxjbx"] Oct 06 09:47:12 crc kubenswrapper[4755]: E1006 09:47:12.252368 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e653ab1-e205-4edc-b2ed-d74060995546" containerName="container-00" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.252381 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e653ab1-e205-4edc-b2ed-d74060995546" containerName="container-00" Oct 06 09:47:12 crc kubenswrapper[4755]: E1006 09:47:12.252407 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32c81bd-1808-40d1-8e6e-ede7e091d994" containerName="registry-server" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.252413 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32c81bd-1808-40d1-8e6e-ede7e091d994" containerName="registry-server" Oct 06 09:47:12 crc kubenswrapper[4755]: E1006 
09:47:12.252431 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32c81bd-1808-40d1-8e6e-ede7e091d994" containerName="extract-utilities" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.252438 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32c81bd-1808-40d1-8e6e-ede7e091d994" containerName="extract-utilities" Oct 06 09:47:12 crc kubenswrapper[4755]: E1006 09:47:12.252446 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32c81bd-1808-40d1-8e6e-ede7e091d994" containerName="extract-content" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.252451 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32c81bd-1808-40d1-8e6e-ede7e091d994" containerName="extract-content" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.252683 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32c81bd-1808-40d1-8e6e-ede7e091d994" containerName="registry-server" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.252696 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e653ab1-e205-4edc-b2ed-d74060995546" containerName="container-00" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.253334 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-srw69/crc-debug-hxjbx" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.416554 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86zsr\" (UniqueName: \"kubernetes.io/projected/2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7-kube-api-access-86zsr\") pod \"crc-debug-hxjbx\" (UID: \"2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7\") " pod="openshift-must-gather-srw69/crc-debug-hxjbx" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.417149 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7-host\") pod \"crc-debug-hxjbx\" (UID: \"2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7\") " pod="openshift-must-gather-srw69/crc-debug-hxjbx" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.518706 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7-host\") pod \"crc-debug-hxjbx\" (UID: \"2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7\") " pod="openshift-must-gather-srw69/crc-debug-hxjbx" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.518900 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86zsr\" (UniqueName: \"kubernetes.io/projected/2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7-kube-api-access-86zsr\") pod \"crc-debug-hxjbx\" (UID: \"2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7\") " pod="openshift-must-gather-srw69/crc-debug-hxjbx" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.519398 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7-host\") pod \"crc-debug-hxjbx\" (UID: \"2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7\") " pod="openshift-must-gather-srw69/crc-debug-hxjbx" Oct 06 09:47:12 crc 
kubenswrapper[4755]: I1006 09:47:12.535918 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86zsr\" (UniqueName: \"kubernetes.io/projected/2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7-kube-api-access-86zsr\") pod \"crc-debug-hxjbx\" (UID: \"2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7\") " pod="openshift-must-gather-srw69/crc-debug-hxjbx" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.570135 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-srw69/crc-debug-hxjbx" Oct 06 09:47:12 crc kubenswrapper[4755]: I1006 09:47:12.752865 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-srw69/crc-debug-hxjbx" event={"ID":"2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7","Type":"ContainerStarted","Data":"45b415af4c7938a02b0e973df9edbe5612302a368a71fc032c38bd1eba8d99c8"} Oct 06 09:47:13 crc kubenswrapper[4755]: I1006 09:47:13.769678 4755 generic.go:334] "Generic (PLEG): container finished" podID="2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7" containerID="48db72d5de3ba3487536b78650c402552fcbb2402066b811d9a825c7627c5c2f" exitCode=0 Oct 06 09:47:13 crc kubenswrapper[4755]: I1006 09:47:13.769847 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-srw69/crc-debug-hxjbx" event={"ID":"2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7","Type":"ContainerDied","Data":"48db72d5de3ba3487536b78650c402552fcbb2402066b811d9a825c7627c5c2f"} Oct 06 09:47:14 crc kubenswrapper[4755]: I1006 09:47:14.916547 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-srw69/crc-debug-hxjbx" Oct 06 09:47:15 crc kubenswrapper[4755]: I1006 09:47:15.064066 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86zsr\" (UniqueName: \"kubernetes.io/projected/2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7-kube-api-access-86zsr\") pod \"2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7\" (UID: \"2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7\") " Oct 06 09:47:15 crc kubenswrapper[4755]: I1006 09:47:15.064783 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7-host\") pod \"2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7\" (UID: \"2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7\") " Oct 06 09:47:15 crc kubenswrapper[4755]: I1006 09:47:15.064848 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7-host" (OuterVolumeSpecName: "host") pod "2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7" (UID: "2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 09:47:15 crc kubenswrapper[4755]: I1006 09:47:15.065395 4755 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7-host\") on node \"crc\" DevicePath \"\"" Oct 06 09:47:15 crc kubenswrapper[4755]: I1006 09:47:15.082365 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7-kube-api-access-86zsr" (OuterVolumeSpecName: "kube-api-access-86zsr") pod "2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7" (UID: "2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7"). InnerVolumeSpecName "kube-api-access-86zsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:47:15 crc kubenswrapper[4755]: I1006 09:47:15.166948 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86zsr\" (UniqueName: \"kubernetes.io/projected/2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7-kube-api-access-86zsr\") on node \"crc\" DevicePath \"\"" Oct 06 09:47:15 crc kubenswrapper[4755]: I1006 09:47:15.823769 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-srw69/crc-debug-hxjbx" event={"ID":"2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7","Type":"ContainerDied","Data":"45b415af4c7938a02b0e973df9edbe5612302a368a71fc032c38bd1eba8d99c8"} Oct 06 09:47:15 crc kubenswrapper[4755]: I1006 09:47:15.825220 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45b415af4c7938a02b0e973df9edbe5612302a368a71fc032c38bd1eba8d99c8" Oct 06 09:47:15 crc kubenswrapper[4755]: I1006 09:47:15.824159 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-srw69/crc-debug-hxjbx" Oct 06 09:47:23 crc kubenswrapper[4755]: I1006 09:47:23.375454 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-srw69/crc-debug-hxjbx"] Oct 06 09:47:23 crc kubenswrapper[4755]: I1006 09:47:23.387341 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-srw69/crc-debug-hxjbx"] Oct 06 09:47:23 crc kubenswrapper[4755]: I1006 09:47:23.906120 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7" path="/var/lib/kubelet/pods/2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7/volumes" Oct 06 09:47:24 crc kubenswrapper[4755]: I1006 09:47:24.598312 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-srw69/crc-debug-2xvqn"] Oct 06 09:47:24 crc kubenswrapper[4755]: E1006 09:47:24.599832 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7" 
containerName="container-00" Oct 06 09:47:24 crc kubenswrapper[4755]: I1006 09:47:24.599852 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7" containerName="container-00" Oct 06 09:47:24 crc kubenswrapper[4755]: I1006 09:47:24.600454 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b3d870a-16cb-47bc-a1ec-3b6f886dd4c7" containerName="container-00" Oct 06 09:47:24 crc kubenswrapper[4755]: I1006 09:47:24.609609 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-srw69/crc-debug-2xvqn" Oct 06 09:47:24 crc kubenswrapper[4755]: I1006 09:47:24.749269 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be002e16-82ab-40d6-a5f2-a5c47c948ee5-host\") pod \"crc-debug-2xvqn\" (UID: \"be002e16-82ab-40d6-a5f2-a5c47c948ee5\") " pod="openshift-must-gather-srw69/crc-debug-2xvqn" Oct 06 09:47:24 crc kubenswrapper[4755]: I1006 09:47:24.749625 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdcxw\" (UniqueName: \"kubernetes.io/projected/be002e16-82ab-40d6-a5f2-a5c47c948ee5-kube-api-access-fdcxw\") pod \"crc-debug-2xvqn\" (UID: \"be002e16-82ab-40d6-a5f2-a5c47c948ee5\") " pod="openshift-must-gather-srw69/crc-debug-2xvqn" Oct 06 09:47:24 crc kubenswrapper[4755]: I1006 09:47:24.852035 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdcxw\" (UniqueName: \"kubernetes.io/projected/be002e16-82ab-40d6-a5f2-a5c47c948ee5-kube-api-access-fdcxw\") pod \"crc-debug-2xvqn\" (UID: \"be002e16-82ab-40d6-a5f2-a5c47c948ee5\") " pod="openshift-must-gather-srw69/crc-debug-2xvqn" Oct 06 09:47:24 crc kubenswrapper[4755]: I1006 09:47:24.852212 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/be002e16-82ab-40d6-a5f2-a5c47c948ee5-host\") pod \"crc-debug-2xvqn\" (UID: \"be002e16-82ab-40d6-a5f2-a5c47c948ee5\") " pod="openshift-must-gather-srw69/crc-debug-2xvqn" Oct 06 09:47:24 crc kubenswrapper[4755]: I1006 09:47:24.852355 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be002e16-82ab-40d6-a5f2-a5c47c948ee5-host\") pod \"crc-debug-2xvqn\" (UID: \"be002e16-82ab-40d6-a5f2-a5c47c948ee5\") " pod="openshift-must-gather-srw69/crc-debug-2xvqn" Oct 06 09:47:24 crc kubenswrapper[4755]: I1006 09:47:24.886323 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdcxw\" (UniqueName: \"kubernetes.io/projected/be002e16-82ab-40d6-a5f2-a5c47c948ee5-kube-api-access-fdcxw\") pod \"crc-debug-2xvqn\" (UID: \"be002e16-82ab-40d6-a5f2-a5c47c948ee5\") " pod="openshift-must-gather-srw69/crc-debug-2xvqn" Oct 06 09:47:24 crc kubenswrapper[4755]: I1006 09:47:24.966181 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-srw69/crc-debug-2xvqn" Oct 06 09:47:25 crc kubenswrapper[4755]: W1006 09:47:25.024333 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe002e16_82ab_40d6_a5f2_a5c47c948ee5.slice/crio-c70a2e38e19eb1afd4057984a5eb6edefde5da2bbee58f88df6dbafaa1d47511 WatchSource:0}: Error finding container c70a2e38e19eb1afd4057984a5eb6edefde5da2bbee58f88df6dbafaa1d47511: Status 404 returned error can't find the container with id c70a2e38e19eb1afd4057984a5eb6edefde5da2bbee58f88df6dbafaa1d47511 Oct 06 09:47:25 crc kubenswrapper[4755]: I1006 09:47:25.934588 4755 generic.go:334] "Generic (PLEG): container finished" podID="be002e16-82ab-40d6-a5f2-a5c47c948ee5" containerID="fe841bac6db37032dd3aae1e337599215fb36fa06cae6bb0c76f7f63d2fe3a29" exitCode=0 Oct 06 09:47:25 crc kubenswrapper[4755]: I1006 09:47:25.934628 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-srw69/crc-debug-2xvqn" event={"ID":"be002e16-82ab-40d6-a5f2-a5c47c948ee5","Type":"ContainerDied","Data":"fe841bac6db37032dd3aae1e337599215fb36fa06cae6bb0c76f7f63d2fe3a29"} Oct 06 09:47:25 crc kubenswrapper[4755]: I1006 09:47:25.935289 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-srw69/crc-debug-2xvqn" event={"ID":"be002e16-82ab-40d6-a5f2-a5c47c948ee5","Type":"ContainerStarted","Data":"c70a2e38e19eb1afd4057984a5eb6edefde5da2bbee58f88df6dbafaa1d47511"} Oct 06 09:47:25 crc kubenswrapper[4755]: I1006 09:47:25.997621 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-srw69/crc-debug-2xvqn"] Oct 06 09:47:26 crc kubenswrapper[4755]: I1006 09:47:26.013794 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-srw69/crc-debug-2xvqn"] Oct 06 09:47:27 crc kubenswrapper[4755]: I1006 09:47:27.054761 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-srw69/crc-debug-2xvqn" Oct 06 09:47:27 crc kubenswrapper[4755]: I1006 09:47:27.204952 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdcxw\" (UniqueName: \"kubernetes.io/projected/be002e16-82ab-40d6-a5f2-a5c47c948ee5-kube-api-access-fdcxw\") pod \"be002e16-82ab-40d6-a5f2-a5c47c948ee5\" (UID: \"be002e16-82ab-40d6-a5f2-a5c47c948ee5\") " Oct 06 09:47:27 crc kubenswrapper[4755]: I1006 09:47:27.205154 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be002e16-82ab-40d6-a5f2-a5c47c948ee5-host\") pod \"be002e16-82ab-40d6-a5f2-a5c47c948ee5\" (UID: \"be002e16-82ab-40d6-a5f2-a5c47c948ee5\") " Oct 06 09:47:27 crc kubenswrapper[4755]: I1006 09:47:27.205622 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be002e16-82ab-40d6-a5f2-a5c47c948ee5-host" (OuterVolumeSpecName: "host") pod "be002e16-82ab-40d6-a5f2-a5c47c948ee5" (UID: "be002e16-82ab-40d6-a5f2-a5c47c948ee5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 09:47:27 crc kubenswrapper[4755]: I1006 09:47:27.206084 4755 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be002e16-82ab-40d6-a5f2-a5c47c948ee5-host\") on node \"crc\" DevicePath \"\"" Oct 06 09:47:27 crc kubenswrapper[4755]: I1006 09:47:27.223928 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be002e16-82ab-40d6-a5f2-a5c47c948ee5-kube-api-access-fdcxw" (OuterVolumeSpecName: "kube-api-access-fdcxw") pod "be002e16-82ab-40d6-a5f2-a5c47c948ee5" (UID: "be002e16-82ab-40d6-a5f2-a5c47c948ee5"). InnerVolumeSpecName "kube-api-access-fdcxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:47:27 crc kubenswrapper[4755]: I1006 09:47:27.308756 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdcxw\" (UniqueName: \"kubernetes.io/projected/be002e16-82ab-40d6-a5f2-a5c47c948ee5-kube-api-access-fdcxw\") on node \"crc\" DevicePath \"\"" Oct 06 09:47:27 crc kubenswrapper[4755]: I1006 09:47:27.893312 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be002e16-82ab-40d6-a5f2-a5c47c948ee5" path="/var/lib/kubelet/pods/be002e16-82ab-40d6-a5f2-a5c47c948ee5/volumes" Oct 06 09:47:27 crc kubenswrapper[4755]: I1006 09:47:27.894799 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545_a7e104a0-7135-4172-b7c1-5edd90949112/util/0.log" Oct 06 09:47:27 crc kubenswrapper[4755]: I1006 09:47:27.954647 4755 scope.go:117] "RemoveContainer" containerID="fe841bac6db37032dd3aae1e337599215fb36fa06cae6bb0c76f7f63d2fe3a29" Oct 06 09:47:27 crc kubenswrapper[4755]: I1006 09:47:27.954856 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-srw69/crc-debug-2xvqn" Oct 06 09:47:28 crc kubenswrapper[4755]: I1006 09:47:28.159107 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545_a7e104a0-7135-4172-b7c1-5edd90949112/util/0.log" Oct 06 09:47:28 crc kubenswrapper[4755]: I1006 09:47:28.180377 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545_a7e104a0-7135-4172-b7c1-5edd90949112/pull/0.log" Oct 06 09:47:28 crc kubenswrapper[4755]: I1006 09:47:28.184119 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545_a7e104a0-7135-4172-b7c1-5edd90949112/pull/0.log" Oct 06 09:47:28 crc kubenswrapper[4755]: I1006 09:47:28.356367 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545_a7e104a0-7135-4172-b7c1-5edd90949112/util/0.log" Oct 06 09:47:28 crc kubenswrapper[4755]: I1006 09:47:28.395057 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545_a7e104a0-7135-4172-b7c1-5edd90949112/extract/0.log" Oct 06 09:47:28 crc kubenswrapper[4755]: I1006 09:47:28.472956 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_075f6ecbefce1054966c006f92b158e5946d1db226e752fb72e26bb260cs545_a7e104a0-7135-4172-b7c1-5edd90949112/pull/0.log" Oct 06 09:47:28 crc kubenswrapper[4755]: I1006 09:47:28.612552 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-lbg2n_cc16a1c7-6450-414d-9e4b-518014071887/kube-rbac-proxy/0.log" Oct 06 09:47:28 crc kubenswrapper[4755]: I1006 09:47:28.673424 4755 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-lbg2n_cc16a1c7-6450-414d-9e4b-518014071887/manager/0.log" Oct 06 09:47:28 crc kubenswrapper[4755]: I1006 09:47:28.697863 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-78fdc95566-rdwj9_5e7b409c-75ff-43bf-87e1-9a7877dd21f3/kube-rbac-proxy/0.log" Oct 06 09:47:28 crc kubenswrapper[4755]: I1006 09:47:28.843719 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-78fdc95566-rdwj9_5e7b409c-75ff-43bf-87e1-9a7877dd21f3/manager/0.log" Oct 06 09:47:28 crc kubenswrapper[4755]: I1006 09:47:28.902508 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-zs28l_1412ad22-876d-4924-9f9b-468970063426/kube-rbac-proxy/0.log" Oct 06 09:47:28 crc kubenswrapper[4755]: I1006 09:47:28.926180 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-zs28l_1412ad22-876d-4924-9f9b-468970063426/manager/0.log" Oct 06 09:47:29 crc kubenswrapper[4755]: I1006 09:47:29.100529 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-w6rtg_169c4b1e-417b-4ddf-9886-6c4668257712/kube-rbac-proxy/0.log" Oct 06 09:47:29 crc kubenswrapper[4755]: I1006 09:47:29.205748 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-w6rtg_169c4b1e-417b-4ddf-9886-6c4668257712/manager/0.log" Oct 06 09:47:29 crc kubenswrapper[4755]: I1006 09:47:29.276829 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-jnt7p_2c9d42bf-9896-4198-aeb9-352d080978d0/kube-rbac-proxy/0.log" Oct 06 09:47:29 crc kubenswrapper[4755]: I1006 
09:47:29.312816 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-jnt7p_2c9d42bf-9896-4198-aeb9-352d080978d0/manager/0.log" Oct 06 09:47:29 crc kubenswrapper[4755]: I1006 09:47:29.399908 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-9qkff_d33c2461-9722-468a-b6e4-20b4ce822f18/kube-rbac-proxy/0.log" Oct 06 09:47:29 crc kubenswrapper[4755]: I1006 09:47:29.482107 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-9qkff_d33c2461-9722-468a-b6e4-20b4ce822f18/manager/0.log" Oct 06 09:47:29 crc kubenswrapper[4755]: I1006 09:47:29.597554 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-6nqcm_cc16e4a5-7b17-4f64-840e-1d0f6971c7a4/kube-rbac-proxy/0.log" Oct 06 09:47:29 crc kubenswrapper[4755]: I1006 09:47:29.773957 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-ld9kw_206b28f4-45a9-4352-bb98-717c408dfcac/kube-rbac-proxy/0.log" Oct 06 09:47:29 crc kubenswrapper[4755]: I1006 09:47:29.799405 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-6nqcm_cc16e4a5-7b17-4f64-840e-1d0f6971c7a4/manager/0.log" Oct 06 09:47:29 crc kubenswrapper[4755]: I1006 09:47:29.851001 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-ld9kw_206b28f4-45a9-4352-bb98-717c408dfcac/manager/0.log" Oct 06 09:47:29 crc kubenswrapper[4755]: I1006 09:47:29.965435 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-6kmrp_5fbfa495-f4d7-4bf8-a489-f8d24476fbf2/kube-rbac-proxy/0.log" Oct 06 
09:47:30 crc kubenswrapper[4755]: I1006 09:47:30.070452 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-6kmrp_5fbfa495-f4d7-4bf8-a489-f8d24476fbf2/manager/0.log" Oct 06 09:47:30 crc kubenswrapper[4755]: I1006 09:47:30.186299 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-zkdrt_26edd385-18c0-41cf-8094-e1844f07364a/kube-rbac-proxy/0.log" Oct 06 09:47:30 crc kubenswrapper[4755]: I1006 09:47:30.296234 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-zkdrt_26edd385-18c0-41cf-8094-e1844f07364a/manager/0.log" Oct 06 09:47:30 crc kubenswrapper[4755]: I1006 09:47:30.348963 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q_a151d352-3084-45b5-80b6-48510de0f087/kube-rbac-proxy/0.log" Oct 06 09:47:30 crc kubenswrapper[4755]: I1006 09:47:30.429337 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-mcg7q_a151d352-3084-45b5-80b6-48510de0f087/manager/0.log" Oct 06 09:47:30 crc kubenswrapper[4755]: I1006 09:47:30.547262 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-6cfbt_fa923997-ddf0-4e0a-9ef9-bde22b553dfb/kube-rbac-proxy/0.log" Oct 06 09:47:30 crc kubenswrapper[4755]: I1006 09:47:30.600915 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-6cfbt_fa923997-ddf0-4e0a-9ef9-bde22b553dfb/manager/0.log" Oct 06 09:47:31 crc kubenswrapper[4755]: I1006 09:47:31.101979 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-k5658_188c5ff1-ba40-4c30-b411-d5beb8cdb4e8/kube-rbac-proxy/0.log" Oct 06 09:47:31 crc kubenswrapper[4755]: I1006 09:47:31.116467 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-nk299_771c3503-d156-46de-81b3-fe3845ecd58f/kube-rbac-proxy/0.log" Oct 06 09:47:31 crc kubenswrapper[4755]: I1006 09:47:31.148150 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-k5658_188c5ff1-ba40-4c30-b411-d5beb8cdb4e8/manager/0.log" Oct 06 09:47:31 crc kubenswrapper[4755]: I1006 09:47:31.178615 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-nk299_771c3503-d156-46de-81b3-fe3845ecd58f/manager/0.log" Oct 06 09:47:31 crc kubenswrapper[4755]: I1006 09:47:31.293743 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9_1a519b4b-8c97-4154-b87c-2cbd91e4453b/kube-rbac-proxy/0.log" Oct 06 09:47:31 crc kubenswrapper[4755]: I1006 09:47:31.347906 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c2c8z9_1a519b4b-8c97-4154-b87c-2cbd91e4453b/manager/0.log" Oct 06 09:47:31 crc kubenswrapper[4755]: I1006 09:47:31.377820 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7f9f8b87ff-hgpl2_f9f27028-c7a1-4bee-bb82-41c4b0354da1/kube-rbac-proxy/0.log" Oct 06 09:47:31 crc kubenswrapper[4755]: I1006 09:47:31.574915 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-74fdc89789-h88l2_a0cbb4ec-8e43-4a88-a2b6-b516c6017546/kube-rbac-proxy/0.log" Oct 06 09:47:31 crc 
kubenswrapper[4755]: I1006 09:47:31.725686 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-74fdc89789-h88l2_a0cbb4ec-8e43-4a88-a2b6-b516c6017546/operator/0.log" Oct 06 09:47:31 crc kubenswrapper[4755]: I1006 09:47:31.812911 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hvmrb_8031183e-bb8c-4447-8853-cc9a3b0a771f/registry-server/0.log" Oct 06 09:47:31 crc kubenswrapper[4755]: I1006 09:47:31.860839 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-gpblg_96e87134-c1a1-49fb-9e05-59e10699741f/kube-rbac-proxy/0.log" Oct 06 09:47:32 crc kubenswrapper[4755]: I1006 09:47:32.065011 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-gpblg_96e87134-c1a1-49fb-9e05-59e10699741f/manager/0.log" Oct 06 09:47:32 crc kubenswrapper[4755]: I1006 09:47:32.187994 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-555f5_746ef71e-2879-4789-879d-a4479700346e/kube-rbac-proxy/0.log" Oct 06 09:47:32 crc kubenswrapper[4755]: I1006 09:47:32.243843 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-555f5_746ef71e-2879-4789-879d-a4479700346e/manager/0.log" Oct 06 09:47:32 crc kubenswrapper[4755]: I1006 09:47:32.434884 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-zmvdb_1091a8d9-172e-4016-b354-16329cdab528/operator/0.log" Oct 06 09:47:32 crc kubenswrapper[4755]: I1006 09:47:32.496977 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-dbmlb_c0fe017f-b521-4146-a2fe-7b790d585e22/kube-rbac-proxy/0.log" Oct 06 09:47:32 crc kubenswrapper[4755]: I1006 09:47:32.563678 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-dbmlb_c0fe017f-b521-4146-a2fe-7b790d585e22/manager/0.log" Oct 06 09:47:33 crc kubenswrapper[4755]: I1006 09:47:33.147868 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7f9f8b87ff-hgpl2_f9f27028-c7a1-4bee-bb82-41c4b0354da1/manager/0.log" Oct 06 09:47:33 crc kubenswrapper[4755]: I1006 09:47:33.352354 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-n6gl6_57a14562-fc08-4785-a24c-ead1cb0919e6/kube-rbac-proxy/0.log" Oct 06 09:47:33 crc kubenswrapper[4755]: I1006 09:47:33.370657 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-8sf8c_73a255d0-1e8a-43c9-b27c-e7ff650c3c79/kube-rbac-proxy/0.log" Oct 06 09:47:33 crc kubenswrapper[4755]: I1006 09:47:33.446505 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-n6gl6_57a14562-fc08-4785-a24c-ead1cb0919e6/manager/0.log" Oct 06 09:47:33 crc kubenswrapper[4755]: I1006 09:47:33.471769 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-8sf8c_73a255d0-1e8a-43c9-b27c-e7ff650c3c79/manager/0.log" Oct 06 09:47:33 crc kubenswrapper[4755]: I1006 09:47:33.579204 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-gjkcw_51cb4df4-8c6a-4563-bd07-9c05182b4216/kube-rbac-proxy/0.log" Oct 06 09:47:33 crc kubenswrapper[4755]: I1006 09:47:33.611859 
4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-gjkcw_51cb4df4-8c6a-4563-bd07-9c05182b4216/manager/0.log" Oct 06 09:47:52 crc kubenswrapper[4755]: I1006 09:47:52.563669 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qxwpv_343af78f-ce0c-4feb-a8d9-38c5a524b342/control-plane-machine-set-operator/0.log" Oct 06 09:47:52 crc kubenswrapper[4755]: I1006 09:47:52.811942 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2lqpg_39386f6f-4d16-4a81-9432-e486d9e6ee60/machine-api-operator/0.log" Oct 06 09:47:52 crc kubenswrapper[4755]: I1006 09:47:52.812878 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2lqpg_39386f6f-4d16-4a81-9432-e486d9e6ee60/kube-rbac-proxy/0.log" Oct 06 09:48:09 crc kubenswrapper[4755]: I1006 09:48:09.599986 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-px4lg_8f030de0-4449-4711-aa8e-9429fc81e43b/cert-manager-controller/0.log" Oct 06 09:48:09 crc kubenswrapper[4755]: I1006 09:48:09.795513 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-6lntj_543c8799-3a5d-49b4-b39e-8e2ca0a055df/cert-manager-cainjector/0.log" Oct 06 09:48:09 crc kubenswrapper[4755]: I1006 09:48:09.819505 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-xdqhx_2c101652-98d5-42e2-be82-f8058baf0fa9/cert-manager-webhook/0.log" Oct 06 09:48:24 crc kubenswrapper[4755]: I1006 09:48:24.564300 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-vf4h7_260d013b-89f5-4f12-959f-fe21b8a52fc6/nmstate-console-plugin/0.log" Oct 06 09:48:25 crc kubenswrapper[4755]: I1006 
09:48:25.386459 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-q99m9_5209c2e5-4435-4b41-950e-b1909e4853dc/nmstate-handler/0.log" Oct 06 09:48:25 crc kubenswrapper[4755]: I1006 09:48:25.626272 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-2ltnx_10aaa9f0-000d-46a3-8108-9e1f04820012/kube-rbac-proxy/0.log" Oct 06 09:48:25 crc kubenswrapper[4755]: I1006 09:48:25.646168 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-2ltnx_10aaa9f0-000d-46a3-8108-9e1f04820012/nmstate-metrics/0.log" Oct 06 09:48:25 crc kubenswrapper[4755]: I1006 09:48:25.788664 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-dxh86_d3aab626-14e3-4151-b7ea-7af710945fee/nmstate-webhook/0.log" Oct 06 09:48:25 crc kubenswrapper[4755]: I1006 09:48:25.849714 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-kklmt_3d1e2fed-d6da-41b0-8fb3-216a7563269e/nmstate-operator/0.log" Oct 06 09:48:42 crc kubenswrapper[4755]: I1006 09:48:42.820198 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-rlwnm_0b397346-b157-4cbf-a489-07ba1c76a602/kube-rbac-proxy/0.log" Oct 06 09:48:42 crc kubenswrapper[4755]: I1006 09:48:42.934677 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-rlwnm_0b397346-b157-4cbf-a489-07ba1c76a602/controller/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.071840 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/cp-frr-files/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.273097 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/cp-metrics/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.279595 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/cp-frr-files/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.280003 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/cp-reloader/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.317270 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/cp-reloader/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.499932 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/cp-reloader/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.504763 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/cp-frr-files/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.554729 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/cp-metrics/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.593132 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/cp-metrics/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.772206 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/cp-frr-files/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.790278 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/cp-reloader/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.808557 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/controller/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.809980 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/cp-metrics/0.log" Oct 06 09:48:43 crc kubenswrapper[4755]: I1006 09:48:43.992066 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/frr-metrics/0.log" Oct 06 09:48:44 crc kubenswrapper[4755]: I1006 09:48:44.083365 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/kube-rbac-proxy/0.log" Oct 06 09:48:44 crc kubenswrapper[4755]: I1006 09:48:44.100969 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/kube-rbac-proxy-frr/0.log" Oct 06 09:48:44 crc kubenswrapper[4755]: I1006 09:48:44.231181 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/reloader/0.log" Oct 06 09:48:44 crc kubenswrapper[4755]: I1006 09:48:44.379678 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-fft4d_7c4e25ec-c928-4063-9c6b-2166042d476e/frr-k8s-webhook-server/0.log" Oct 06 09:48:44 crc kubenswrapper[4755]: I1006 09:48:44.604045 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-746849c9fd-kwxmk_f3ddc2a6-55c5-42e4-bc84-82413245a1a6/manager/0.log" Oct 06 09:48:44 crc kubenswrapper[4755]: I1006 09:48:44.701621 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-688944c458-dszrn_fccaa716-5ad5-4994-b4c3-352cc2a11a2e/webhook-server/0.log" Oct 06 09:48:44 crc kubenswrapper[4755]: I1006 09:48:44.851809 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4rh89_5b7e2120-cc02-4414-a2e4-55e198617480/kube-rbac-proxy/0.log" Oct 06 09:48:45 crc kubenswrapper[4755]: I1006 09:48:45.371615 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4rh89_5b7e2120-cc02-4414-a2e4-55e198617480/speaker/0.log" Oct 06 09:48:45 crc kubenswrapper[4755]: I1006 09:48:45.546755 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gt2qg_17120783-c2c7-4718-8a90-e89951659106/frr/0.log" Oct 06 09:49:02 crc kubenswrapper[4755]: I1006 09:49:02.411694 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6_274bae86-c37c-47a1-9f5a-842fec70c251/util/0.log" Oct 06 09:49:02 crc kubenswrapper[4755]: I1006 09:49:02.668214 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6_274bae86-c37c-47a1-9f5a-842fec70c251/pull/0.log" Oct 06 09:49:02 crc kubenswrapper[4755]: I1006 09:49:02.717379 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6_274bae86-c37c-47a1-9f5a-842fec70c251/util/0.log" Oct 06 09:49:02 crc kubenswrapper[4755]: I1006 09:49:02.737675 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6_274bae86-c37c-47a1-9f5a-842fec70c251/pull/0.log" Oct 06 09:49:02 crc kubenswrapper[4755]: I1006 09:49:02.878217 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6_274bae86-c37c-47a1-9f5a-842fec70c251/util/0.log" Oct 06 09:49:02 crc kubenswrapper[4755]: I1006 09:49:02.920964 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6_274bae86-c37c-47a1-9f5a-842fec70c251/pull/0.log" Oct 06 09:49:02 crc kubenswrapper[4755]: I1006 09:49:02.937723 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29mxd6_274bae86-c37c-47a1-9f5a-842fec70c251/extract/0.log" Oct 06 09:49:03 crc kubenswrapper[4755]: I1006 09:49:03.788456 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p8wxk_6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08/extract-utilities/0.log" Oct 06 09:49:03 crc kubenswrapper[4755]: I1006 09:49:03.955062 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p8wxk_6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08/extract-content/0.log" Oct 06 09:49:03 crc kubenswrapper[4755]: I1006 09:49:03.964125 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p8wxk_6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08/extract-content/0.log" Oct 06 09:49:04 crc kubenswrapper[4755]: I1006 09:49:04.003359 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p8wxk_6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08/extract-utilities/0.log" Oct 06 09:49:04 crc kubenswrapper[4755]: I1006 09:49:04.188819 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p8wxk_6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08/extract-content/0.log" Oct 06 09:49:04 crc kubenswrapper[4755]: I1006 09:49:04.220853 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-p8wxk_6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08/extract-utilities/0.log" Oct 06 09:49:04 crc kubenswrapper[4755]: I1006 09:49:04.466234 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx9pz_a5c7c681-770b-49f0-aeae-e751bedb73c0/extract-utilities/0.log" Oct 06 09:49:04 crc kubenswrapper[4755]: I1006 09:49:04.694693 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx9pz_a5c7c681-770b-49f0-aeae-e751bedb73c0/extract-utilities/0.log" Oct 06 09:49:04 crc kubenswrapper[4755]: I1006 09:49:04.709986 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx9pz_a5c7c681-770b-49f0-aeae-e751bedb73c0/extract-content/0.log" Oct 06 09:49:04 crc kubenswrapper[4755]: I1006 09:49:04.720117 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx9pz_a5c7c681-770b-49f0-aeae-e751bedb73c0/extract-content/0.log" Oct 06 09:49:04 crc kubenswrapper[4755]: I1006 09:49:04.801983 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p8wxk_6ce9263d-ebe4-4a6e-ba20-014e6f7d6b08/registry-server/0.log" Oct 06 09:49:05 crc kubenswrapper[4755]: I1006 09:49:05.603663 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx9pz_a5c7c681-770b-49f0-aeae-e751bedb73c0/extract-utilities/0.log" Oct 06 09:49:05 crc kubenswrapper[4755]: I1006 09:49:05.650168 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx9pz_a5c7c681-770b-49f0-aeae-e751bedb73c0/extract-content/0.log" Oct 06 09:49:05 crc kubenswrapper[4755]: I1006 09:49:05.775772 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t_d8dffa79-06e6-40e3-9769-541d9af8f0f8/util/0.log" Oct 06 09:49:05 crc kubenswrapper[4755]: I1006 09:49:05.937930 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t_d8dffa79-06e6-40e3-9769-541d9af8f0f8/pull/0.log" Oct 06 09:49:05 crc kubenswrapper[4755]: I1006 09:49:05.938878 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t_d8dffa79-06e6-40e3-9769-541d9af8f0f8/pull/0.log" Oct 06 09:49:05 crc kubenswrapper[4755]: I1006 09:49:05.989898 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t_d8dffa79-06e6-40e3-9769-541d9af8f0f8/util/0.log" Oct 06 09:49:06 crc kubenswrapper[4755]: I1006 09:49:06.136183 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t_d8dffa79-06e6-40e3-9769-541d9af8f0f8/pull/0.log" Oct 06 09:49:06 crc kubenswrapper[4755]: I1006 09:49:06.160741 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t_d8dffa79-06e6-40e3-9769-541d9af8f0f8/util/0.log" Oct 06 09:49:06 crc kubenswrapper[4755]: I1006 09:49:06.175555 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c8tt5t_d8dffa79-06e6-40e3-9769-541d9af8f0f8/extract/0.log" Oct 06 09:49:06 crc kubenswrapper[4755]: I1006 09:49:06.270015 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mx9pz_a5c7c681-770b-49f0-aeae-e751bedb73c0/registry-server/0.log" Oct 06 09:49:06 crc 
kubenswrapper[4755]: I1006 09:49:06.369945 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8z4xp_92d7f1d8-b288-4877-9ab3-e3710c46ad0d/extract-utilities/0.log" Oct 06 09:49:06 crc kubenswrapper[4755]: I1006 09:49:06.401788 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-p4ld2_41630c1b-822f-4194-a858-b5f9868ad9e6/marketplace-operator/0.log" Oct 06 09:49:06 crc kubenswrapper[4755]: I1006 09:49:06.515595 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8z4xp_92d7f1d8-b288-4877-9ab3-e3710c46ad0d/extract-utilities/0.log" Oct 06 09:49:06 crc kubenswrapper[4755]: I1006 09:49:06.530099 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8z4xp_92d7f1d8-b288-4877-9ab3-e3710c46ad0d/extract-content/0.log" Oct 06 09:49:06 crc kubenswrapper[4755]: I1006 09:49:06.535828 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8z4xp_92d7f1d8-b288-4877-9ab3-e3710c46ad0d/extract-content/0.log" Oct 06 09:49:06 crc kubenswrapper[4755]: I1006 09:49:06.779747 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jbt9h_9f819d94-d78c-453c-92f7-2e2f66c4f5b8/extract-utilities/0.log" Oct 06 09:49:06 crc kubenswrapper[4755]: I1006 09:49:06.779902 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8z4xp_92d7f1d8-b288-4877-9ab3-e3710c46ad0d/extract-utilities/0.log" Oct 06 09:49:06 crc kubenswrapper[4755]: I1006 09:49:06.793664 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8z4xp_92d7f1d8-b288-4877-9ab3-e3710c46ad0d/extract-content/0.log" Oct 06 09:49:06 crc kubenswrapper[4755]: I1006 09:49:06.863268 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-8z4xp_92d7f1d8-b288-4877-9ab3-e3710c46ad0d/registry-server/0.log" Oct 06 09:49:06 crc kubenswrapper[4755]: I1006 09:49:06.985961 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jbt9h_9f819d94-d78c-453c-92f7-2e2f66c4f5b8/extract-content/0.log" Oct 06 09:49:07 crc kubenswrapper[4755]: I1006 09:49:07.000615 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jbt9h_9f819d94-d78c-453c-92f7-2e2f66c4f5b8/extract-utilities/0.log" Oct 06 09:49:07 crc kubenswrapper[4755]: I1006 09:49:07.038235 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jbt9h_9f819d94-d78c-453c-92f7-2e2f66c4f5b8/extract-content/0.log" Oct 06 09:49:07 crc kubenswrapper[4755]: I1006 09:49:07.204155 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jbt9h_9f819d94-d78c-453c-92f7-2e2f66c4f5b8/extract-utilities/0.log" Oct 06 09:49:07 crc kubenswrapper[4755]: I1006 09:49:07.256344 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jbt9h_9f819d94-d78c-453c-92f7-2e2f66c4f5b8/extract-content/0.log" Oct 06 09:49:07 crc kubenswrapper[4755]: I1006 09:49:07.723049 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jbt9h_9f819d94-d78c-453c-92f7-2e2f66c4f5b8/registry-server/0.log" Oct 06 09:49:18 crc kubenswrapper[4755]: I1006 09:49:18.912691 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:49:18 crc kubenswrapper[4755]: I1006 09:49:18.913523 4755 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:49:48 crc kubenswrapper[4755]: I1006 09:49:48.912432 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:49:48 crc kubenswrapper[4755]: I1006 09:49:48.913501 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:50:18 crc kubenswrapper[4755]: I1006 09:50:18.912555 4755 patch_prober.go:28] interesting pod/machine-config-daemon-rfqsq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:50:18 crc kubenswrapper[4755]: I1006 09:50:18.913465 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:50:18 crc kubenswrapper[4755]: I1006 09:50:18.913547 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" Oct 06 09:50:18 crc 
kubenswrapper[4755]: I1006 09:50:18.915016 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226"} pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:50:18 crc kubenswrapper[4755]: I1006 09:50:18.915127 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerName="machine-config-daemon" containerID="cri-o://56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" gracePeriod=600 Oct 06 09:50:19 crc kubenswrapper[4755]: E1006 09:50:19.050253 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:50:19 crc kubenswrapper[4755]: I1006 09:50:19.072469 4755 generic.go:334] "Generic (PLEG): container finished" podID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" exitCode=0 Oct 06 09:50:19 crc kubenswrapper[4755]: I1006 09:50:19.072556 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" event={"ID":"854f4c9e-3c8a-47bb-9427-bb5bfc5691d7","Type":"ContainerDied","Data":"56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226"} Oct 06 09:50:19 crc kubenswrapper[4755]: I1006 09:50:19.072665 4755 scope.go:117] "RemoveContainer" 
containerID="23a9d9458ee8bfc68906867f6ba2d0cc07d6b8cdc0736c7276d2a9fe7b88f3f9" Oct 06 09:50:19 crc kubenswrapper[4755]: I1006 09:50:19.075345 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:50:19 crc kubenswrapper[4755]: E1006 09:50:19.081489 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:50:31 crc kubenswrapper[4755]: I1006 09:50:31.879921 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:50:31 crc kubenswrapper[4755]: E1006 09:50:31.881731 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:50:43 crc kubenswrapper[4755]: I1006 09:50:43.886144 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:50:43 crc kubenswrapper[4755]: E1006 09:50:43.886966 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:50:56 crc kubenswrapper[4755]: I1006 09:50:56.881278 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:50:56 crc kubenswrapper[4755]: E1006 09:50:56.882683 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:51:10 crc kubenswrapper[4755]: I1006 09:51:10.880694 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:51:10 crc kubenswrapper[4755]: E1006 09:51:10.885119 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:51:22 crc kubenswrapper[4755]: I1006 09:51:22.879872 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:51:22 crc kubenswrapper[4755]: E1006 09:51:22.881110 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:51:31 crc kubenswrapper[4755]: I1006 09:51:31.042607 4755 generic.go:334] "Generic (PLEG): container finished" podID="8410ff38-76bd-40d9-99e3-7a6e1d85c220" containerID="d3f1ee475eef78ddc12712e7ce8dfb2dce36432fba50882d5cde8639ff2cb04f" exitCode=0 Oct 06 09:51:31 crc kubenswrapper[4755]: I1006 09:51:31.042741 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-srw69/must-gather-rqdkn" event={"ID":"8410ff38-76bd-40d9-99e3-7a6e1d85c220","Type":"ContainerDied","Data":"d3f1ee475eef78ddc12712e7ce8dfb2dce36432fba50882d5cde8639ff2cb04f"} Oct 06 09:51:31 crc kubenswrapper[4755]: I1006 09:51:31.044905 4755 scope.go:117] "RemoveContainer" containerID="d3f1ee475eef78ddc12712e7ce8dfb2dce36432fba50882d5cde8639ff2cb04f" Oct 06 09:51:31 crc kubenswrapper[4755]: I1006 09:51:31.357089 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-srw69_must-gather-rqdkn_8410ff38-76bd-40d9-99e3-7a6e1d85c220/gather/0.log" Oct 06 09:51:33 crc kubenswrapper[4755]: I1006 09:51:33.897489 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:51:33 crc kubenswrapper[4755]: E1006 09:51:33.898791 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:51:40 crc kubenswrapper[4755]: I1006 09:51:40.444049 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-srw69/must-gather-rqdkn"] Oct 06 09:51:40 crc kubenswrapper[4755]: I1006 09:51:40.445001 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-srw69/must-gather-rqdkn" podUID="8410ff38-76bd-40d9-99e3-7a6e1d85c220" containerName="copy" containerID="cri-o://3a8570884caf7afd9353a54d28a3a956853d986fcaa699c3a2840d8e71a9c322" gracePeriod=2 Oct 06 09:51:40 crc kubenswrapper[4755]: I1006 09:51:40.453713 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-srw69/must-gather-rqdkn"] Oct 06 09:51:40 crc kubenswrapper[4755]: I1006 09:51:40.958360 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-srw69_must-gather-rqdkn_8410ff38-76bd-40d9-99e3-7a6e1d85c220/copy/0.log" Oct 06 09:51:40 crc kubenswrapper[4755]: I1006 09:51:40.960834 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-srw69/must-gather-rqdkn" Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.116182 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8410ff38-76bd-40d9-99e3-7a6e1d85c220-must-gather-output\") pod \"8410ff38-76bd-40d9-99e3-7a6e1d85c220\" (UID: \"8410ff38-76bd-40d9-99e3-7a6e1d85c220\") " Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.116700 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdsrc\" (UniqueName: \"kubernetes.io/projected/8410ff38-76bd-40d9-99e3-7a6e1d85c220-kube-api-access-bdsrc\") pod \"8410ff38-76bd-40d9-99e3-7a6e1d85c220\" (UID: \"8410ff38-76bd-40d9-99e3-7a6e1d85c220\") " Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.135908 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8410ff38-76bd-40d9-99e3-7a6e1d85c220-kube-api-access-bdsrc" (OuterVolumeSpecName: 
"kube-api-access-bdsrc") pod "8410ff38-76bd-40d9-99e3-7a6e1d85c220" (UID: "8410ff38-76bd-40d9-99e3-7a6e1d85c220"). InnerVolumeSpecName "kube-api-access-bdsrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.188784 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-srw69_must-gather-rqdkn_8410ff38-76bd-40d9-99e3-7a6e1d85c220/copy/0.log" Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.189629 4755 generic.go:334] "Generic (PLEG): container finished" podID="8410ff38-76bd-40d9-99e3-7a6e1d85c220" containerID="3a8570884caf7afd9353a54d28a3a956853d986fcaa699c3a2840d8e71a9c322" exitCode=143 Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.189692 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-srw69/must-gather-rqdkn" Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.189696 4755 scope.go:117] "RemoveContainer" containerID="3a8570884caf7afd9353a54d28a3a956853d986fcaa699c3a2840d8e71a9c322" Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.210619 4755 scope.go:117] "RemoveContainer" containerID="d3f1ee475eef78ddc12712e7ce8dfb2dce36432fba50882d5cde8639ff2cb04f" Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.219521 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdsrc\" (UniqueName: \"kubernetes.io/projected/8410ff38-76bd-40d9-99e3-7a6e1d85c220-kube-api-access-bdsrc\") on node \"crc\" DevicePath \"\"" Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.262022 4755 scope.go:117] "RemoveContainer" containerID="3a8570884caf7afd9353a54d28a3a956853d986fcaa699c3a2840d8e71a9c322" Oct 06 09:51:41 crc kubenswrapper[4755]: E1006 09:51:41.262937 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a8570884caf7afd9353a54d28a3a956853d986fcaa699c3a2840d8e71a9c322\": container with ID starting 
with 3a8570884caf7afd9353a54d28a3a956853d986fcaa699c3a2840d8e71a9c322 not found: ID does not exist" containerID="3a8570884caf7afd9353a54d28a3a956853d986fcaa699c3a2840d8e71a9c322" Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.262997 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8570884caf7afd9353a54d28a3a956853d986fcaa699c3a2840d8e71a9c322"} err="failed to get container status \"3a8570884caf7afd9353a54d28a3a956853d986fcaa699c3a2840d8e71a9c322\": rpc error: code = NotFound desc = could not find container \"3a8570884caf7afd9353a54d28a3a956853d986fcaa699c3a2840d8e71a9c322\": container with ID starting with 3a8570884caf7afd9353a54d28a3a956853d986fcaa699c3a2840d8e71a9c322 not found: ID does not exist" Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.263049 4755 scope.go:117] "RemoveContainer" containerID="d3f1ee475eef78ddc12712e7ce8dfb2dce36432fba50882d5cde8639ff2cb04f" Oct 06 09:51:41 crc kubenswrapper[4755]: E1006 09:51:41.263319 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3f1ee475eef78ddc12712e7ce8dfb2dce36432fba50882d5cde8639ff2cb04f\": container with ID starting with d3f1ee475eef78ddc12712e7ce8dfb2dce36432fba50882d5cde8639ff2cb04f not found: ID does not exist" containerID="d3f1ee475eef78ddc12712e7ce8dfb2dce36432fba50882d5cde8639ff2cb04f" Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.263359 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3f1ee475eef78ddc12712e7ce8dfb2dce36432fba50882d5cde8639ff2cb04f"} err="failed to get container status \"d3f1ee475eef78ddc12712e7ce8dfb2dce36432fba50882d5cde8639ff2cb04f\": rpc error: code = NotFound desc = could not find container \"d3f1ee475eef78ddc12712e7ce8dfb2dce36432fba50882d5cde8639ff2cb04f\": container with ID starting with d3f1ee475eef78ddc12712e7ce8dfb2dce36432fba50882d5cde8639ff2cb04f not found: ID does 
not exist" Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.290675 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8410ff38-76bd-40d9-99e3-7a6e1d85c220-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8410ff38-76bd-40d9-99e3-7a6e1d85c220" (UID: "8410ff38-76bd-40d9-99e3-7a6e1d85c220"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.321525 4755 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8410ff38-76bd-40d9-99e3-7a6e1d85c220-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 06 09:51:41 crc kubenswrapper[4755]: I1006 09:51:41.900169 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8410ff38-76bd-40d9-99e3-7a6e1d85c220" path="/var/lib/kubelet/pods/8410ff38-76bd-40d9-99e3-7a6e1d85c220/volumes" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.465546 4755 scope.go:117] "RemoveContainer" containerID="8d0b503c7d49ad9a62a220c5949e1ebf891b129c41413353f782dc30e38cc6f0" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.673438 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lwxm2"] Oct 06 09:51:43 crc kubenswrapper[4755]: E1006 09:51:43.678226 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8410ff38-76bd-40d9-99e3-7a6e1d85c220" containerName="gather" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.678266 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8410ff38-76bd-40d9-99e3-7a6e1d85c220" containerName="gather" Oct 06 09:51:43 crc kubenswrapper[4755]: E1006 09:51:43.678303 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8410ff38-76bd-40d9-99e3-7a6e1d85c220" containerName="copy" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.678310 4755 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="8410ff38-76bd-40d9-99e3-7a6e1d85c220" containerName="copy" Oct 06 09:51:43 crc kubenswrapper[4755]: E1006 09:51:43.678325 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be002e16-82ab-40d6-a5f2-a5c47c948ee5" containerName="container-00" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.678333 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="be002e16-82ab-40d6-a5f2-a5c47c948ee5" containerName="container-00" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.678517 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8410ff38-76bd-40d9-99e3-7a6e1d85c220" containerName="gather" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.678526 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="be002e16-82ab-40d6-a5f2-a5c47c948ee5" containerName="container-00" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.678550 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8410ff38-76bd-40d9-99e3-7a6e1d85c220" containerName="copy" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.680034 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.696512 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lwxm2"] Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.789367 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-catalog-content\") pod \"certified-operators-lwxm2\" (UID: \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\") " pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.789875 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dctng\" (UniqueName: \"kubernetes.io/projected/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-kube-api-access-dctng\") pod \"certified-operators-lwxm2\" (UID: \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\") " pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.790116 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-utilities\") pod \"certified-operators-lwxm2\" (UID: \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\") " pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.893178 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-utilities\") pod \"certified-operators-lwxm2\" (UID: \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\") " pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.893323 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-catalog-content\") pod \"certified-operators-lwxm2\" (UID: \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\") " pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.893472 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dctng\" (UniqueName: \"kubernetes.io/projected/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-kube-api-access-dctng\") pod \"certified-operators-lwxm2\" (UID: \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\") " pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.894233 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-utilities\") pod \"certified-operators-lwxm2\" (UID: \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\") " pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.894290 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-catalog-content\") pod \"certified-operators-lwxm2\" (UID: \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\") " pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:43 crc kubenswrapper[4755]: I1006 09:51:43.925456 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dctng\" (UniqueName: \"kubernetes.io/projected/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-kube-api-access-dctng\") pod \"certified-operators-lwxm2\" (UID: \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\") " pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:44 crc kubenswrapper[4755]: I1006 09:51:44.014510 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:44 crc kubenswrapper[4755]: I1006 09:51:44.634977 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lwxm2"] Oct 06 09:51:45 crc kubenswrapper[4755]: I1006 09:51:45.248682 4755 generic.go:334] "Generic (PLEG): container finished" podID="751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" containerID="29ddbd5c450cbd9c9b08e824c55409acffe90fb052bb4e6e1cd8ef496f971f8a" exitCode=0 Oct 06 09:51:45 crc kubenswrapper[4755]: I1006 09:51:45.248731 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwxm2" event={"ID":"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1","Type":"ContainerDied","Data":"29ddbd5c450cbd9c9b08e824c55409acffe90fb052bb4e6e1cd8ef496f971f8a"} Oct 06 09:51:45 crc kubenswrapper[4755]: I1006 09:51:45.248763 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwxm2" event={"ID":"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1","Type":"ContainerStarted","Data":"b9867a48f6e2c5c385eb5cfd9df272b0adca7a387bc60c22751b2336a1a1fa09"} Oct 06 09:51:46 crc kubenswrapper[4755]: I1006 09:51:46.268233 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwxm2" event={"ID":"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1","Type":"ContainerStarted","Data":"7b811ca24768a19b60b260f4062a1151f021ce884c810fe3bd12113da50c14f0"} Oct 06 09:51:47 crc kubenswrapper[4755]: I1006 09:51:47.880788 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:51:47 crc kubenswrapper[4755]: E1006 09:51:47.881959 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:51:48 crc kubenswrapper[4755]: I1006 09:51:48.306988 4755 generic.go:334] "Generic (PLEG): container finished" podID="751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" containerID="7b811ca24768a19b60b260f4062a1151f021ce884c810fe3bd12113da50c14f0" exitCode=0 Oct 06 09:51:48 crc kubenswrapper[4755]: I1006 09:51:48.307463 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwxm2" event={"ID":"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1","Type":"ContainerDied","Data":"7b811ca24768a19b60b260f4062a1151f021ce884c810fe3bd12113da50c14f0"} Oct 06 09:51:49 crc kubenswrapper[4755]: I1006 09:51:49.321539 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwxm2" event={"ID":"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1","Type":"ContainerStarted","Data":"f7d333877af755ad390256a26102dc212a9af3037723a14ae926378f408a7d50"} Oct 06 09:51:49 crc kubenswrapper[4755]: I1006 09:51:49.347367 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lwxm2" podStartSLOduration=2.846632333 podStartE2EDuration="6.347347276s" podCreationTimestamp="2025-10-06 09:51:43 +0000 UTC" firstStartedPulling="2025-10-06 09:51:45.251027687 +0000 UTC m=+5362.080342901" lastFinishedPulling="2025-10-06 09:51:48.7517426 +0000 UTC m=+5365.581057844" observedRunningTime="2025-10-06 09:51:49.34149001 +0000 UTC m=+5366.170805254" watchObservedRunningTime="2025-10-06 09:51:49.347347276 +0000 UTC m=+5366.176662500" Oct 06 09:51:54 crc kubenswrapper[4755]: I1006 09:51:54.015224 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:54 crc kubenswrapper[4755]: I1006 
09:51:54.016136 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:54 crc kubenswrapper[4755]: I1006 09:51:54.840531 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:54 crc kubenswrapper[4755]: I1006 09:51:54.934610 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:55 crc kubenswrapper[4755]: I1006 09:51:55.103513 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lwxm2"] Oct 06 09:51:56 crc kubenswrapper[4755]: I1006 09:51:56.433180 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lwxm2" podUID="751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" containerName="registry-server" containerID="cri-o://f7d333877af755ad390256a26102dc212a9af3037723a14ae926378f408a7d50" gracePeriod=2 Oct 06 09:51:57 crc kubenswrapper[4755]: I1006 09:51:57.454054 4755 generic.go:334] "Generic (PLEG): container finished" podID="751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" containerID="f7d333877af755ad390256a26102dc212a9af3037723a14ae926378f408a7d50" exitCode=0 Oct 06 09:51:57 crc kubenswrapper[4755]: I1006 09:51:57.454168 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwxm2" event={"ID":"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1","Type":"ContainerDied","Data":"f7d333877af755ad390256a26102dc212a9af3037723a14ae926378f408a7d50"} Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.118626 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.238830 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dctng\" (UniqueName: \"kubernetes.io/projected/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-kube-api-access-dctng\") pod \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\" (UID: \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\") " Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.239429 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-utilities\") pod \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\" (UID: \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\") " Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.240292 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-catalog-content\") pod \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\" (UID: \"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1\") " Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.240940 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-utilities" (OuterVolumeSpecName: "utilities") pod "751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" (UID: "751d12b1-5d6e-48f7-9feb-1a81a74eb8f1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.241376 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.249816 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-kube-api-access-dctng" (OuterVolumeSpecName: "kube-api-access-dctng") pod "751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" (UID: "751d12b1-5d6e-48f7-9feb-1a81a74eb8f1"). InnerVolumeSpecName "kube-api-access-dctng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.315749 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" (UID: "751d12b1-5d6e-48f7-9feb-1a81a74eb8f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.344270 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.344323 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dctng\" (UniqueName: \"kubernetes.io/projected/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1-kube-api-access-dctng\") on node \"crc\" DevicePath \"\"" Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.479374 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lwxm2" event={"ID":"751d12b1-5d6e-48f7-9feb-1a81a74eb8f1","Type":"ContainerDied","Data":"b9867a48f6e2c5c385eb5cfd9df272b0adca7a387bc60c22751b2336a1a1fa09"} Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.479469 4755 scope.go:117] "RemoveContainer" containerID="f7d333877af755ad390256a26102dc212a9af3037723a14ae926378f408a7d50" Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.479458 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lwxm2" Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.505295 4755 scope.go:117] "RemoveContainer" containerID="7b811ca24768a19b60b260f4062a1151f021ce884c810fe3bd12113da50c14f0" Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.538411 4755 scope.go:117] "RemoveContainer" containerID="29ddbd5c450cbd9c9b08e824c55409acffe90fb052bb4e6e1cd8ef496f971f8a" Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.542952 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lwxm2"] Oct 06 09:51:58 crc kubenswrapper[4755]: I1006 09:51:58.552484 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lwxm2"] Oct 06 09:51:59 crc kubenswrapper[4755]: I1006 09:51:59.880549 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:51:59 crc kubenswrapper[4755]: E1006 09:51:59.881458 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:51:59 crc kubenswrapper[4755]: I1006 09:51:59.899958 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" path="/var/lib/kubelet/pods/751d12b1-5d6e-48f7-9feb-1a81a74eb8f1/volumes" Oct 06 09:52:12 crc kubenswrapper[4755]: I1006 09:52:12.880515 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:52:12 crc kubenswrapper[4755]: E1006 09:52:12.882052 4755 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:52:25 crc kubenswrapper[4755]: I1006 09:52:25.881309 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:52:25 crc kubenswrapper[4755]: E1006 09:52:25.882549 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:52:37 crc kubenswrapper[4755]: I1006 09:52:37.880484 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:52:37 crc kubenswrapper[4755]: E1006 09:52:37.882397 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:52:49 crc kubenswrapper[4755]: I1006 09:52:49.879293 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:52:49 crc kubenswrapper[4755]: E1006 09:52:49.880540 4755 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.250908 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lk8d5"] Oct 06 09:53:03 crc kubenswrapper[4755]: E1006 09:53:03.252903 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" containerName="extract-utilities" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.252931 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" containerName="extract-utilities" Oct 06 09:53:03 crc kubenswrapper[4755]: E1006 09:53:03.253002 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" containerName="extract-content" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.253017 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" containerName="extract-content" Oct 06 09:53:03 crc kubenswrapper[4755]: E1006 09:53:03.253081 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" containerName="registry-server" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.253096 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" containerName="registry-server" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.253546 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="751d12b1-5d6e-48f7-9feb-1a81a74eb8f1" containerName="registry-server" Oct 06 09:53:03 crc 
kubenswrapper[4755]: I1006 09:53:03.257051 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.259479 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lk8d5"] Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.373721 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-utilities\") pod \"community-operators-lk8d5\" (UID: \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\") " pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.373852 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-catalog-content\") pod \"community-operators-lk8d5\" (UID: \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\") " pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.373912 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxtsl\" (UniqueName: \"kubernetes.io/projected/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-kube-api-access-jxtsl\") pod \"community-operators-lk8d5\" (UID: \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\") " pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.476043 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-utilities\") pod \"community-operators-lk8d5\" (UID: \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\") " pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:03 crc 
kubenswrapper[4755]: I1006 09:53:03.476163 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-catalog-content\") pod \"community-operators-lk8d5\" (UID: \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\") " pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.476199 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxtsl\" (UniqueName: \"kubernetes.io/projected/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-kube-api-access-jxtsl\") pod \"community-operators-lk8d5\" (UID: \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\") " pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.477275 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-catalog-content\") pod \"community-operators-lk8d5\" (UID: \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\") " pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.477272 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-utilities\") pod \"community-operators-lk8d5\" (UID: \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\") " pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.505199 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxtsl\" (UniqueName: \"kubernetes.io/projected/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-kube-api-access-jxtsl\") pod \"community-operators-lk8d5\" (UID: \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\") " pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 
09:53:03.597298 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:03 crc kubenswrapper[4755]: I1006 09:53:03.893158 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:53:03 crc kubenswrapper[4755]: E1006 09:53:03.894144 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:53:04 crc kubenswrapper[4755]: I1006 09:53:04.183307 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lk8d5"] Oct 06 09:53:04 crc kubenswrapper[4755]: W1006 09:53:04.190170 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ffd276d_a507_465b_86f5_6e7f4e38ae2c.slice/crio-deb1da2a85ed8c7d2ed99c4d4768f28ca0961dced37e1c15e5586e025e978202 WatchSource:0}: Error finding container deb1da2a85ed8c7d2ed99c4d4768f28ca0961dced37e1c15e5586e025e978202: Status 404 returned error can't find the container with id deb1da2a85ed8c7d2ed99c4d4768f28ca0961dced37e1c15e5586e025e978202 Oct 06 09:53:04 crc kubenswrapper[4755]: I1006 09:53:04.445809 4755 generic.go:334] "Generic (PLEG): container finished" podID="9ffd276d-a507-465b-86f5-6e7f4e38ae2c" containerID="008aa77680c204d0b6d70661a453fe16dee7c3af5b1b310f6ada2e951af9a223" exitCode=0 Oct 06 09:53:04 crc kubenswrapper[4755]: I1006 09:53:04.445888 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lk8d5" 
event={"ID":"9ffd276d-a507-465b-86f5-6e7f4e38ae2c","Type":"ContainerDied","Data":"008aa77680c204d0b6d70661a453fe16dee7c3af5b1b310f6ada2e951af9a223"} Oct 06 09:53:04 crc kubenswrapper[4755]: I1006 09:53:04.445929 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lk8d5" event={"ID":"9ffd276d-a507-465b-86f5-6e7f4e38ae2c","Type":"ContainerStarted","Data":"deb1da2a85ed8c7d2ed99c4d4768f28ca0961dced37e1c15e5586e025e978202"} Oct 06 09:53:04 crc kubenswrapper[4755]: I1006 09:53:04.448399 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:53:05 crc kubenswrapper[4755]: I1006 09:53:05.462667 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lk8d5" event={"ID":"9ffd276d-a507-465b-86f5-6e7f4e38ae2c","Type":"ContainerStarted","Data":"730dc9c98a7c24c0c31a6a3b383d3051295aab66a02b445dfe6387e9a5600b0e"} Oct 06 09:53:07 crc kubenswrapper[4755]: I1006 09:53:07.491278 4755 generic.go:334] "Generic (PLEG): container finished" podID="9ffd276d-a507-465b-86f5-6e7f4e38ae2c" containerID="730dc9c98a7c24c0c31a6a3b383d3051295aab66a02b445dfe6387e9a5600b0e" exitCode=0 Oct 06 09:53:07 crc kubenswrapper[4755]: I1006 09:53:07.492268 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lk8d5" event={"ID":"9ffd276d-a507-465b-86f5-6e7f4e38ae2c","Type":"ContainerDied","Data":"730dc9c98a7c24c0c31a6a3b383d3051295aab66a02b445dfe6387e9a5600b0e"} Oct 06 09:53:08 crc kubenswrapper[4755]: I1006 09:53:08.509667 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lk8d5" event={"ID":"9ffd276d-a507-465b-86f5-6e7f4e38ae2c","Type":"ContainerStarted","Data":"61a00f7a40d05e2e22ed39d1751762571ff91c4ef395f27221988aa9350869e4"} Oct 06 09:53:08 crc kubenswrapper[4755]: I1006 09:53:08.559434 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-lk8d5" podStartSLOduration=2.08394268 podStartE2EDuration="5.55940385s" podCreationTimestamp="2025-10-06 09:53:03 +0000 UTC" firstStartedPulling="2025-10-06 09:53:04.448071335 +0000 UTC m=+5441.277386539" lastFinishedPulling="2025-10-06 09:53:07.923532455 +0000 UTC m=+5444.752847709" observedRunningTime="2025-10-06 09:53:08.544167779 +0000 UTC m=+5445.373483063" watchObservedRunningTime="2025-10-06 09:53:08.55940385 +0000 UTC m=+5445.388719074" Oct 06 09:53:13 crc kubenswrapper[4755]: I1006 09:53:13.597480 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:13 crc kubenswrapper[4755]: I1006 09:53:13.598270 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:13 crc kubenswrapper[4755]: I1006 09:53:13.678470 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:14 crc kubenswrapper[4755]: I1006 09:53:14.689526 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:14 crc kubenswrapper[4755]: I1006 09:53:14.766074 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lk8d5"] Oct 06 09:53:16 crc kubenswrapper[4755]: I1006 09:53:16.619968 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lk8d5" podUID="9ffd276d-a507-465b-86f5-6e7f4e38ae2c" containerName="registry-server" containerID="cri-o://61a00f7a40d05e2e22ed39d1751762571ff91c4ef395f27221988aa9350869e4" gracePeriod=2 Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.321156 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.401867 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-catalog-content\") pod \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\" (UID: \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\") " Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.402020 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-utilities\") pod \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\" (UID: \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\") " Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.402226 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxtsl\" (UniqueName: \"kubernetes.io/projected/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-kube-api-access-jxtsl\") pod \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\" (UID: \"9ffd276d-a507-465b-86f5-6e7f4e38ae2c\") " Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.403721 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-utilities" (OuterVolumeSpecName: "utilities") pod "9ffd276d-a507-465b-86f5-6e7f4e38ae2c" (UID: "9ffd276d-a507-465b-86f5-6e7f4e38ae2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.409794 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-kube-api-access-jxtsl" (OuterVolumeSpecName: "kube-api-access-jxtsl") pod "9ffd276d-a507-465b-86f5-6e7f4e38ae2c" (UID: "9ffd276d-a507-465b-86f5-6e7f4e38ae2c"). InnerVolumeSpecName "kube-api-access-jxtsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.460545 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ffd276d-a507-465b-86f5-6e7f4e38ae2c" (UID: "9ffd276d-a507-465b-86f5-6e7f4e38ae2c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.504674 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxtsl\" (UniqueName: \"kubernetes.io/projected/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-kube-api-access-jxtsl\") on node \"crc\" DevicePath \"\"" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.504724 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.504747 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffd276d-a507-465b-86f5-6e7f4e38ae2c-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.638999 4755 generic.go:334] "Generic (PLEG): container finished" podID="9ffd276d-a507-465b-86f5-6e7f4e38ae2c" containerID="61a00f7a40d05e2e22ed39d1751762571ff91c4ef395f27221988aa9350869e4" exitCode=0 Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.639065 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lk8d5" event={"ID":"9ffd276d-a507-465b-86f5-6e7f4e38ae2c","Type":"ContainerDied","Data":"61a00f7a40d05e2e22ed39d1751762571ff91c4ef395f27221988aa9350869e4"} Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.639179 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-lk8d5" event={"ID":"9ffd276d-a507-465b-86f5-6e7f4e38ae2c","Type":"ContainerDied","Data":"deb1da2a85ed8c7d2ed99c4d4768f28ca0961dced37e1c15e5586e025e978202"} Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.639220 4755 scope.go:117] "RemoveContainer" containerID="61a00f7a40d05e2e22ed39d1751762571ff91c4ef395f27221988aa9350869e4" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.639105 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lk8d5" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.681090 4755 scope.go:117] "RemoveContainer" containerID="730dc9c98a7c24c0c31a6a3b383d3051295aab66a02b445dfe6387e9a5600b0e" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.704368 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lk8d5"] Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.714113 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lk8d5"] Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.723998 4755 scope.go:117] "RemoveContainer" containerID="008aa77680c204d0b6d70661a453fe16dee7c3af5b1b310f6ada2e951af9a223" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.802358 4755 scope.go:117] "RemoveContainer" containerID="61a00f7a40d05e2e22ed39d1751762571ff91c4ef395f27221988aa9350869e4" Oct 06 09:53:17 crc kubenswrapper[4755]: E1006 09:53:17.802836 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a00f7a40d05e2e22ed39d1751762571ff91c4ef395f27221988aa9350869e4\": container with ID starting with 61a00f7a40d05e2e22ed39d1751762571ff91c4ef395f27221988aa9350869e4 not found: ID does not exist" containerID="61a00f7a40d05e2e22ed39d1751762571ff91c4ef395f27221988aa9350869e4" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 
09:53:17.802882 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a00f7a40d05e2e22ed39d1751762571ff91c4ef395f27221988aa9350869e4"} err="failed to get container status \"61a00f7a40d05e2e22ed39d1751762571ff91c4ef395f27221988aa9350869e4\": rpc error: code = NotFound desc = could not find container \"61a00f7a40d05e2e22ed39d1751762571ff91c4ef395f27221988aa9350869e4\": container with ID starting with 61a00f7a40d05e2e22ed39d1751762571ff91c4ef395f27221988aa9350869e4 not found: ID does not exist" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.802915 4755 scope.go:117] "RemoveContainer" containerID="730dc9c98a7c24c0c31a6a3b383d3051295aab66a02b445dfe6387e9a5600b0e" Oct 06 09:53:17 crc kubenswrapper[4755]: E1006 09:53:17.803613 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"730dc9c98a7c24c0c31a6a3b383d3051295aab66a02b445dfe6387e9a5600b0e\": container with ID starting with 730dc9c98a7c24c0c31a6a3b383d3051295aab66a02b445dfe6387e9a5600b0e not found: ID does not exist" containerID="730dc9c98a7c24c0c31a6a3b383d3051295aab66a02b445dfe6387e9a5600b0e" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.803679 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"730dc9c98a7c24c0c31a6a3b383d3051295aab66a02b445dfe6387e9a5600b0e"} err="failed to get container status \"730dc9c98a7c24c0c31a6a3b383d3051295aab66a02b445dfe6387e9a5600b0e\": rpc error: code = NotFound desc = could not find container \"730dc9c98a7c24c0c31a6a3b383d3051295aab66a02b445dfe6387e9a5600b0e\": container with ID starting with 730dc9c98a7c24c0c31a6a3b383d3051295aab66a02b445dfe6387e9a5600b0e not found: ID does not exist" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.803730 4755 scope.go:117] "RemoveContainer" containerID="008aa77680c204d0b6d70661a453fe16dee7c3af5b1b310f6ada2e951af9a223" Oct 06 09:53:17 crc 
kubenswrapper[4755]: E1006 09:53:17.804161 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008aa77680c204d0b6d70661a453fe16dee7c3af5b1b310f6ada2e951af9a223\": container with ID starting with 008aa77680c204d0b6d70661a453fe16dee7c3af5b1b310f6ada2e951af9a223 not found: ID does not exist" containerID="008aa77680c204d0b6d70661a453fe16dee7c3af5b1b310f6ada2e951af9a223" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.804200 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008aa77680c204d0b6d70661a453fe16dee7c3af5b1b310f6ada2e951af9a223"} err="failed to get container status \"008aa77680c204d0b6d70661a453fe16dee7c3af5b1b310f6ada2e951af9a223\": rpc error: code = NotFound desc = could not find container \"008aa77680c204d0b6d70661a453fe16dee7c3af5b1b310f6ada2e951af9a223\": container with ID starting with 008aa77680c204d0b6d70661a453fe16dee7c3af5b1b310f6ada2e951af9a223 not found: ID does not exist" Oct 06 09:53:17 crc kubenswrapper[4755]: I1006 09:53:17.896946 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ffd276d-a507-465b-86f5-6e7f4e38ae2c" path="/var/lib/kubelet/pods/9ffd276d-a507-465b-86f5-6e7f4e38ae2c/volumes" Oct 06 09:53:18 crc kubenswrapper[4755]: I1006 09:53:18.881321 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:53:18 crc kubenswrapper[4755]: E1006 09:53:18.882728 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:53:31 crc 
kubenswrapper[4755]: I1006 09:53:31.881007 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:53:31 crc kubenswrapper[4755]: E1006 09:53:31.882371 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:53:43 crc kubenswrapper[4755]: I1006 09:53:43.730128 4755 scope.go:117] "RemoveContainer" containerID="48db72d5de3ba3487536b78650c402552fcbb2402066b811d9a825c7627c5c2f" Oct 06 09:53:44 crc kubenswrapper[4755]: I1006 09:53:44.879437 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:53:44 crc kubenswrapper[4755]: E1006 09:53:44.880222 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:53:55 crc kubenswrapper[4755]: I1006 09:53:55.879943 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:53:55 crc kubenswrapper[4755]: E1006 09:53:55.880840 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7" Oct 06 09:54:10 crc kubenswrapper[4755]: I1006 09:54:10.879214 4755 scope.go:117] "RemoveContainer" containerID="56a3f1721acafbd5c0815931eb68d66871ea32258fdb653887206a80ffdc0226" Oct 06 09:54:10 crc kubenswrapper[4755]: E1006 09:54:10.880888 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfqsq_openshift-machine-config-operator(854f4c9e-3c8a-47bb-9427-bb5bfc5691d7)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfqsq" podUID="854f4c9e-3c8a-47bb-9427-bb5bfc5691d7"